Oct 03 14:00:48 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 03 14:00:49 crc restorecon[4569]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to
system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 
14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc 
restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:00:49 crc restorecon[4569]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 14:00:49 crc restorecon[4569]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 14:00:49 crc restorecon[4569]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 03 14:00:50 crc kubenswrapper[4636]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 14:00:50 crc kubenswrapper[4636]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 03 14:00:50 crc kubenswrapper[4636]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 14:00:50 crc kubenswrapper[4636]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 03 14:00:50 crc kubenswrapper[4636]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 03 14:00:50 crc kubenswrapper[4636]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.565924 4636 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579575 4636 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579617 4636 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579621 4636 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579625 4636 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579638 4636 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579643 4636 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579648 4636 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579653 4636 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579657 4636 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579662 4636 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579699 4636 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579706 4636 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579710 4636 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579714 4636 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579718 4636 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579723 4636 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579731 4636 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579735 4636 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579739 4636 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579742 4636 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 
14:00:50.579746 4636 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579750 4636 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579754 4636 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579758 4636 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579761 4636 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579766 4636 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579771 4636 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579775 4636 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579780 4636 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579784 4636 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579788 4636 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579793 4636 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579797 4636 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579801 4636 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579807 4636 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579810 4636 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579814 4636 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579818 4636 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579821 4636 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579825 4636 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579831 4636 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579835 4636 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579839 4636 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579843 4636 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579847 4636 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579850 4636 feature_gate.go:330] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579856 4636 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579859 4636 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579863 4636 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579867 4636 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579871 4636 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579875 4636 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579880 4636 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579888 4636 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579892 4636 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579896 4636 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579901 4636 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579905 4636 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579909 4636 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579914 4636 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579918 4636 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579922 4636 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579926 4636 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579930 4636 feature_gate.go:330] unrecognized feature gate: Example Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579934 4636 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579946 4636 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579952 4636 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579957 4636 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579962 4636 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579966 4636 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.579970 4636 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580251 4636 flags.go:64] FLAG: --address="0.0.0.0" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580266 4636 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580281 4636 flags.go:64] FLAG: --anonymous-auth="true" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580292 4636 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580613 4636 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580620 4636 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580628 4636 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580635 4636 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580640 4636 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580644 4636 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580650 4636 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580655 4636 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580660 4636 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580665 4636 flags.go:64] FLAG: --cgroup-root="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580669 4636 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580673 4636 flags.go:64] FLAG: --client-ca-file="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580678 4636 flags.go:64] FLAG: --cloud-config="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580683 4636 flags.go:64] FLAG: --cloud-provider="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580687 4636 flags.go:64] FLAG: --cluster-dns="[]" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580697 4636 flags.go:64] FLAG: --cluster-domain="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580701 4636 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580707 4636 flags.go:64] FLAG: --config-dir="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580711 4636 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580717 4636 flags.go:64] FLAG: --container-log-max-files="5" Oct 
03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580725 4636 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580731 4636 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580737 4636 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580742 4636 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580747 4636 flags.go:64] FLAG: --contention-profiling="false" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580752 4636 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580757 4636 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580761 4636 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580768 4636 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580793 4636 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580797 4636 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580802 4636 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580806 4636 flags.go:64] FLAG: --enable-load-reader="false" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580811 4636 flags.go:64] FLAG: --enable-server="true" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580816 4636 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580825 4636 flags.go:64] FLAG: --event-burst="100" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580833 4636 flags.go:64] FLAG: --event-qps="50" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580839 4636 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580844 4636 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580851 4636 flags.go:64] FLAG: --eviction-hard="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580858 4636 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580864 4636 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580869 4636 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580874 4636 flags.go:64] FLAG: --eviction-soft="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580879 4636 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580883 4636 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580888 4636 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580892 4636 flags.go:64] FLAG: --experimental-mounter-path="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580896 4636 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580900 4636 flags.go:64] FLAG: 
--fail-swap-on="true" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580906 4636 flags.go:64] FLAG: --feature-gates="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580912 4636 flags.go:64] FLAG: --file-check-frequency="20s" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580918 4636 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580923 4636 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580929 4636 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580935 4636 flags.go:64] FLAG: --healthz-port="10248" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580940 4636 flags.go:64] FLAG: --help="false" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580948 4636 flags.go:64] FLAG: --hostname-override="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580954 4636 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580960 4636 flags.go:64] FLAG: --http-check-frequency="20s" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580965 4636 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580969 4636 flags.go:64] FLAG: --image-credential-provider-config="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580974 4636 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580979 4636 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580984 4636 flags.go:64] FLAG: --image-service-endpoint="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580989 4636 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580994 4636 flags.go:64] FLAG: --kube-api-burst="100" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580999 4636 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581004 4636 flags.go:64] FLAG: --kube-api-qps="50" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581008 4636 flags.go:64] FLAG: --kube-reserved="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581019 4636 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581028 4636 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581033 4636 flags.go:64] FLAG: --kubelet-cgroups="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581037 4636 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581046 4636 flags.go:64] FLAG: --lock-file="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581051 4636 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581055 4636 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581060 4636 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581072 4636 flags.go:64] FLAG: --log-json-split-stream="false" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581086 4636 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 03 14:00:50 crc 
kubenswrapper[4636]: I1003 14:00:50.581091 4636 flags.go:64] FLAG: --log-text-split-stream="false" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581120 4636 flags.go:64] FLAG: --logging-format="text" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581135 4636 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581140 4636 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581145 4636 flags.go:64] FLAG: --manifest-url="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581150 4636 flags.go:64] FLAG: --manifest-url-header="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581158 4636 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581163 4636 flags.go:64] FLAG: --max-open-files="1000000" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581168 4636 flags.go:64] FLAG: --max-pods="110" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581174 4636 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581179 4636 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581184 4636 flags.go:64] FLAG: --memory-manager-policy="None" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581190 4636 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581195 4636 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581200 4636 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581205 4636 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581223 4636 flags.go:64] FLAG: --node-status-max-images="50" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581228 4636 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581232 4636 flags.go:64] FLAG: --oom-score-adj="-999" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581238 4636 flags.go:64] FLAG: --pod-cidr="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581243 4636 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581255 4636 flags.go:64] FLAG: --pod-manifest-path="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581260 4636 flags.go:64] FLAG: --pod-max-pids="-1" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581266 4636 flags.go:64] FLAG: --pods-per-core="0" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581271 4636 flags.go:64] FLAG: --port="10250" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581276 4636 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581281 4636 flags.go:64] FLAG: --provider-id="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581285 4636 flags.go:64] FLAG: --qos-reserved="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581289 4636 flags.go:64] FLAG: --read-only-port="10255" Oct 03 14:00:50 crc 
kubenswrapper[4636]: I1003 14:00:50.581294 4636 flags.go:64] FLAG: --register-node="true" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581299 4636 flags.go:64] FLAG: --register-schedulable="true" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581304 4636 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581315 4636 flags.go:64] FLAG: --registry-burst="10" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581320 4636 flags.go:64] FLAG: --registry-qps="5" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581327 4636 flags.go:64] FLAG: --reserved-cpus="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581332 4636 flags.go:64] FLAG: --reserved-memory="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581338 4636 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581343 4636 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581347 4636 flags.go:64] FLAG: --rotate-certificates="false" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581352 4636 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581357 4636 flags.go:64] FLAG: --runonce="false" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581361 4636 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581365 4636 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581370 4636 flags.go:64] FLAG: --seccomp-default="false" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581374 4636 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581378 4636 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581383 4636 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581388 4636 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581393 4636 flags.go:64] FLAG: --storage-driver-password="root" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581397 4636 flags.go:64] FLAG: --storage-driver-secure="false" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581402 4636 flags.go:64] FLAG: --storage-driver-table="stats" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581407 4636 flags.go:64] FLAG: --storage-driver-user="root" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581413 4636 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581420 4636 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581426 4636 flags.go:64] FLAG: --system-cgroups="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581431 4636 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581441 4636 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581446 4636 flags.go:64] FLAG: --tls-cert-file="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581450 4636 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 03 14:00:50 
crc kubenswrapper[4636]: I1003 14:00:50.581459 4636 flags.go:64] FLAG: --tls-min-version="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581463 4636 flags.go:64] FLAG: --tls-private-key-file="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581467 4636 flags.go:64] FLAG: --topology-manager-policy="none" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581471 4636 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581476 4636 flags.go:64] FLAG: --topology-manager-scope="container" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581480 4636 flags.go:64] FLAG: --v="2" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581492 4636 flags.go:64] FLAG: --version="false" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581501 4636 flags.go:64] FLAG: --vmodule="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581506 4636 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581511 4636 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581657 4636 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581663 4636 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581668 4636 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581672 4636 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581676 4636 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581679 4636 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581684 4636 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581688 4636 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
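The flags.go:64 entries above are the kubelet echoing the effective value of every command-line flag, one entry per flag, always in the fixed form FLAG: --name="value". A small sketch for pulling that dump out of the journal; the regex assumes the exact quoting shown above.

import re

# Matches e.g.: I1003 14:00:50.580655 4636 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
FLAG_RE = re.compile(r'flags\.go:64\] FLAG: (--[\w.-]+)="([^"]*)"')

def parse_flag_dump(journal_text: str) -> dict:
    """Return {flag: effective value} for every FLAG: entry in the text."""
    return dict(FLAG_RE.findall(journal_text))

sample = ('Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.580655 '
          '4636 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"')
print(parse_flag_dump(sample))  # {'--cert-dir': '/var/lib/kubelet/pki'}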
Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581693 4636 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581697 4636 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581701 4636 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581706 4636 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581710 4636 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581714 4636 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581718 4636 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581722 4636 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581726 4636 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581731 4636 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581735 4636 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581739 4636 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581743 4636 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581747 4636 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581750 4636 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581754 4636 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581758 4636 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581761 4636 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581765 4636 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581769 4636 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581774 4636 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581778 4636 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581782 4636 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581786 4636 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581791 4636 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581794 4636 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 
14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581800 4636 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581803 4636 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581808 4636 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581813 4636 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581819 4636 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581824 4636 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581830 4636 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581835 4636 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581839 4636 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581844 4636 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581849 4636 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581854 4636 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581859 4636 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581864 4636 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581873 4636 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581878 4636 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581882 4636 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581888 4636 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581892 4636 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581896 4636 feature_gate.go:330] unrecognized feature gate: Example Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581901 4636 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581905 4636 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581910 4636 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581915 4636 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581919 4636 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 14:00:50 crc 
kubenswrapper[4636]: W1003 14:00:50.581923 4636 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581928 4636 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581933 4636 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581937 4636 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581941 4636 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581944 4636 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581949 4636 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581954 4636 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581958 4636 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581962 4636 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581966 4636 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.581970 4636 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.581977 4636 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.591448 4636 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.591497 4636 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591594 4636 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
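Once the per-gate warnings are done, feature_gate.go:386 prints the resolved gate set in Go's fmt notation, {map[Name:bool ...]}. A sketch that turns that summary line into a Python dict for inspection; it assumes gate names never contain ':' or ']', which holds for every such line in this boot.

import re

# Abridged copy of the feature_gate.go:386 summary line from this boot.
line = ('I1003 14:00:50.581977 4636 feature_gate.go:386] feature gates: '
        '{map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true '
        'KMSv1:true ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}')

body = re.search(r'map\[([^\]]*)\]', line).group(1)
gates = {name: val == 'true'
         for name, val in (pair.split(':', 1) for pair in body.split())}
print(gates)
# {'CloudDualStackNodeIPs': True, ..., 'VolumeAttributesClass': False}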
Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591614 4636 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591620 4636 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591627 4636 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591635 4636 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591642 4636 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591649 4636 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591656 4636 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591662 4636 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591669 4636 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591676 4636 feature_gate.go:330] unrecognized feature gate: Example Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591682 4636 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591687 4636 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591692 4636 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591697 4636 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591702 4636 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591708 4636 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591713 4636 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591719 4636 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591724 4636 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591729 4636 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591734 4636 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591739 4636 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591744 4636 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591748 4636 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591754 4636 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591759 4636 
feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591763 4636 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591768 4636 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591773 4636 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591778 4636 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591783 4636 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591788 4636 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591792 4636 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591799 4636 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591804 4636 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591809 4636 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591814 4636 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591819 4636 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591824 4636 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591828 4636 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591833 4636 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591838 4636 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591843 4636 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591847 4636 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591852 4636 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591857 4636 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591862 4636 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591867 4636 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591872 4636 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591877 4636 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591884 4636 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
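This is now the third pass over the same gate list (timestamps 14:00:50.579*, .581*, .591*; a fourth follows at .592*): each pass re-applies the configured gates and re-logs one warning per name this kubelet build does not recognize, which is what makes the block repeat. A quick sketch to confirm the passes carry identical content by counting warnings per gate name:

from collections import Counter
import re

GATE_RE = re.compile(r'unrecognized feature gate: (\w+)')

def count_unrecognized(journal_text: str) -> Counter:
    """Count warnings per unrecognized gate name across the whole journal."""
    return Counter(GATE_RE.findall(journal_text))

# Fed this boot's journal, every OpenShift-specific gate (GatewayAPI, NewOLM,
# PinnedImages, ...) should show the same count: one warning per pass.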
Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591893 4636 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591900 4636 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591905 4636 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591911 4636 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591917 4636 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591924 4636 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591947 4636 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591952 4636 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591958 4636 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591964 4636 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591969 4636 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591974 4636 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591979 4636 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591984 4636 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591989 4636 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591994 4636 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.591999 4636 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592004 4636 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592010 4636 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.592021 4636 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592236 4636 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592251 4636 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 14:00:50 crc 
kubenswrapper[4636]: W1003 14:00:50.592256 4636 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592262 4636 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592267 4636 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592272 4636 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592277 4636 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592282 4636 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592288 4636 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592293 4636 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592298 4636 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592303 4636 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592308 4636 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592313 4636 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592317 4636 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592322 4636 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592327 4636 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592332 4636 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592337 4636 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592342 4636 feature_gate.go:330] unrecognized feature gate: Example Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592346 4636 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592354 4636 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592361 4636 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592368 4636 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592373 4636 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592379 4636 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592385 4636 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592391 4636 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592397 4636 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592402 4636 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592408 4636 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592413 4636 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592418 4636 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592423 4636 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592430 4636 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592436 4636 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592442 4636 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592449 4636 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592455 4636 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592459 4636 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592466 4636 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592472 4636 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592479 4636 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592485 4636 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592490 4636 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592495 4636 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592500 4636 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592505 4636 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592511 4636 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592516 4636 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592520 4636 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592527 4636 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592532 4636 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592538 4636 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592542 4636 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592547 4636 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592552 4636 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592557 4636 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592562 4636 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592567 4636 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592571 4636 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592576 4636 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592581 4636 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592586 4636 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592591 4636 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592597 4636 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592602 4636 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592608 4636 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592614 4636 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 14:00:50 crc 
kubenswrapper[4636]: W1003 14:00:50.592618 4636 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.592624 4636 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.592633 4636 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.593855 4636 server.go:940] "Client rotation is on, will bootstrap in background" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.598607 4636 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.598740 4636 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.601822 4636 server.go:997] "Starting client certificate rotation" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.601860 4636 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.602141 4636 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-08 16:14:58.612937785 +0000 UTC Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.602315 4636 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 866h14m8.010626661s for next certificate rotation Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.628433 4636 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.630242 4636 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.643841 4636 log.go:25] "Validated CRI v1 runtime API" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.680161 4636 log.go:25] "Validated CRI v1 image API" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.682121 4636 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.687145 4636 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-03-13-55-20-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.687174 4636 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} 
/var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.699152 4636 manager.go:217] Machine: {Timestamp:2025-10-03 14:00:50.697516615 +0000 UTC m=+0.556242882 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2800000 MemoryCapacity:25199480832 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:5822d918-3835-42d5-a2d8-0c9b2af0c4b1 BootID:c9943c44-af0e-4d0e-8d9b-fbf9dab653b1 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599742464 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:56:84:fe Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:56:84:fe Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:81:7d:dd Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:2d:47:5b Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:57:e2:b1 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:f6:86:a1 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:c2:2a:98:bb:2d:a7 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:9e:c1:ce:3c:78:f4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199480832 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified 
Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.699381 4636 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.699621 4636 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.700391 4636 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.700823 4636 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.700895 4636 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.701965 4636 topology_manager.go:138] "Creating topology manager with none policy" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.702029 4636 container_manager_linux.go:303] "Creating device plugin manager" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 
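[annotation] The HardEvictionThresholds in the NodeConfig above mix absolute quantities (memory.available < 100Mi) with percentages (nodefs.available < 10%, inodesFree < 5%, imagefs.available < 15%). Resolved against the /var filesystem reported earlier (/dev/vda4, 85292941312 bytes, 41679680 inodes), the percentage signals work out roughly as in this back-of-the-envelope sketch; values are copied from the log and the arithmetic is illustrative, not kubelet code:

```go
// Back-of-the-envelope resolution of the percentage-based eviction
// thresholds logged above. Capacities are copied from this log's fs
// table; the math is illustrative, not kubelet's implementation.
package main

import "fmt"

func main() {
	const (
		varCapacity = 85292941312 // /dev/vda4 (/var), bytes, from the log
		varInodes   = 41679680    // /dev/vda4 inodes, from the log
	)
	fmt.Printf("nodefs.available  < 10%% => evict below ~%.1f GiB free\n",
		0.10*varCapacity/(1<<30))
	fmt.Printf("nodefs.inodesFree < 5%%  => evict below ~%.1f M inodes\n",
		0.05*varInodes/1e6)
	fmt.Printf("imagefs.available < 15%% => evict below ~%.1f GiB free\n",
		0.15*varCapacity/(1<<30))
}
```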
14:00:50.703079 4636 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.703149 4636 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.703666 4636 state_mem.go:36] "Initialized new in-memory state store" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.703830 4636 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.710769 4636 kubelet.go:418] "Attempting to sync node with API server" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.710804 4636 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.710828 4636 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.710846 4636 kubelet.go:324] "Adding apiserver pod source" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.710863 4636 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.717350 4636 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.719358 4636 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.719345 4636 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.24:6443: connect: connection refused Oct 03 14:00:50 crc kubenswrapper[4636]: E1003 14:00:50.719587 4636 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.24:6443: connect: connection refused" logger="UnhandledError" Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.719670 4636 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.24:6443: connect: connection refused Oct 03 14:00:50 crc kubenswrapper[4636]: E1003 14:00:50.719799 4636 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.24:6443: connect: connection refused" logger="UnhandledError" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.720988 4636 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.723021 4636 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.723054 4636 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 
03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.723065 4636 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.723074 4636 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.723090 4636 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.723121 4636 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.723130 4636 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.723148 4636 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.723161 4636 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.723171 4636 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.723186 4636 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.723197 4636 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.724826 4636 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.725484 4636 server.go:1280] "Started kubelet" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.726008 4636 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.726773 4636 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.726941 4636 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.727048 4636 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.24:6443: connect: connection refused Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.727771 4636 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.727796 4636 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.727863 4636 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 08:18:34.815947865 +0000 UTC Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.727959 4636 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1650h17m44.087995059s for next certificate rotation Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.728249 4636 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.728265 4636 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.728339 4636 desired_state_of_world_populator.go:146] "Desired state 
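[annotation] Both certificate managers (kube-apiserver-client-kubelet earlier, kubelet-serving here) log an expiration, a jittered rotation deadline, and the resulting wait. The wait is simply deadline minus now; for the kubelet-serving pair above it reproduces the logged ~1650h17m44s. Timestamps below are copied from the log and the parsing is plain stdlib time math, not the certificate_manager code:

```go
// Reproduces the "Waiting ... for next certificate rotation" arithmetic
// from the certificate_manager lines above. Timestamps are copied from
// the log; this is plain stdlib time math, not kubelet code.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	now, _ := time.Parse(layout, "2025-10-03 14:00:50.727959 +0000 UTC")
	deadline, _ := time.Parse(layout, "2025-12-11 08:18:34.815947865 +0000 UTC")
	fmt.Println(deadline.Sub(now)) // ~1650h17m44s, matching the log line
}
```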
populator starts to run" Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.729606 4636 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.24:6443: connect: connection refused Oct 03 14:00:50 crc systemd[1]: Started Kubernetes Kubelet. Oct 03 14:00:50 crc kubenswrapper[4636]: E1003 14:00:50.729970 4636 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.24:6443: connect: connection refused" logger="UnhandledError" Oct 03 14:00:50 crc kubenswrapper[4636]: E1003 14:00:50.730080 4636 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 03 14:00:50 crc kubenswrapper[4636]: E1003 14:00:50.732186 4636 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.24:6443: connect: connection refused" interval="200ms" Oct 03 14:00:50 crc kubenswrapper[4636]: E1003 14:00:50.731979 4636 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.24:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186afff951c56753 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-03 14:00:50.725431123 +0000 UTC m=+0.584157380,LastTimestamp:2025-10-03 14:00:50.725431123 +0000 UTC m=+0.584157380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.734619 4636 factory.go:153] Registering CRI-O factory Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.734917 4636 factory.go:221] Registration of the crio container factory successfully Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.734973 4636 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.734982 4636 factory.go:55] Registering systemd factory Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.734989 4636 factory.go:221] Registration of the systemd container factory successfully Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.735005 4636 factory.go:103] Registering Raw factory Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.735018 4636 manager.go:1196] Started watching for new ooms in manager Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.734873 4636 server.go:460] "Adding debug handlers to kubelet server" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.735475 4636 manager.go:319] Starting recovery of all containers Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757005 4636 reconstruct.go:130] "Volume is 
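[annotation] Every failing list/watch in this window targets the same endpoint (api-int.crc.testing:6443 → 38.129.56.24), and the field selectors are URL-encoded in the request line: spec.clusterIP%21%3DNone is spec.clusterIP!=None. The "connection refused" errors are expected at this phase, since the kubelet starts before the static-pod apiserver is up. A small probe that decodes the selector and checks whether the endpoint is accepting connections yet, assuming api-int.crc.testing resolves on the node:

```go
// Decodes the URL-encoded field selector from the failing list calls and
// probes the apiserver endpoint they all target. Assumes this runs on a
// node where api-int.crc.testing resolves; a refused dial matches the
// "connection refused" reflector errors above.
package main

import (
	"fmt"
	"net"
	"net/url"
	"time"
)

func main() {
	sel, _ := url.QueryUnescape("spec.clusterIP%21%3DNone")
	fmt.Println("field selector:", sel) // spec.clusterIP!=None

	conn, err := net.DialTimeout("tcp", "api-int.crc.testing:6443", 2*time.Second)
	if err != nil {
		fmt.Println("apiserver not reachable yet:", err)
		return
	}
	conn.Close()
	fmt.Println("apiserver is accepting connections")
}
```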
marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757148 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757185 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757216 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757248 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757279 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757309 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757340 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757378 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757405 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757434 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757465 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757497 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757542 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757574 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757610 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757642 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757672 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757701 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757730 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757758 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757798 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757826 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757855 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757921 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757951 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.757987 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758019 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758047 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758074 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758194 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758237 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758265 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758294 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758326 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758355 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758385 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758414 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758442 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758473 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758501 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758529 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758560 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758591 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758653 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758701 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758732 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758766 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758794 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758824 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758854 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758885 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.758996 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759044 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759079 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759147 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759182 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759214 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759245 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759278 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759311 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759341 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759372 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759406 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759434 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759468 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759498 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759528 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759573 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759603 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759638 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759668 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759699 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759741 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759774 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759820 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759874 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759908 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759953 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.759989 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760030 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760064 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760163 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760197 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760235 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760276 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760307 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760338 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760373 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760405 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760440 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760475 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760506 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760537 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760567 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760601 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760631 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760660 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760693 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760730 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760761 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760793 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760823 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760859 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.760962 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761013 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761053 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761124 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761161 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761200 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761237 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761279 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761324 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761358 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761391 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761426 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761459 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761493 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761525 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761558 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761588 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761622 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761657 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761689 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761721 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761752 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761780 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761817 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761943 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.761995 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.762029 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.762058 4636 manager.go:324] Recovery completed
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.762066 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.763834 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.763935 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.764006 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.764084 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.764192 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.764260 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.764336 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.764417 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.764494 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.764572 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.764652 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.764823 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.764917 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.765000 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.765069 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.765173 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.765254 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.765332 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.765416 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.765498 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.765569 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.765659 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.765736 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.765803 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.765858 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.765949 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.766039 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.766828 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.766919 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.766994 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.767084 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.767218 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.767304 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.767402 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.767494 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.767581 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.767650 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.774518 4636 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.774596 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.774628 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.774654 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.774679 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.774702 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.774725 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.774747 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.774764 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.774781 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.774798 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.774821 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.774846 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.774866 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.774888 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.774913 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.774933 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.774951 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.774972 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.774990 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.775013 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.775034 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.775055 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.775079 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.775122 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.775140 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.775160 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.775180 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.775199 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.775200 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.775351 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.775377 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.775401 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.775424 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.775445 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.775468 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.775490 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.775516 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.775536 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.775558 4636 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.775579 4636 reconstruct.go:97] "Volume reconstruction finished"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.775593 4636 reconciler.go:26] "Reconciler: start to sync state"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.779407 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.779460 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.779478 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.780355 4636 cpu_manager.go:225] "Starting CPU manager" policy="none"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.780381 4636 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.780407 4636 state_mem.go:36] "Initialized new in-memory state store"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.790763 4636 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.792431 4636 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.792462 4636 status_manager.go:217] "Starting to sync pod status with apiserver"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.792485 4636 kubelet.go:2335] "Starting kubelet main sync loop"
Oct 03 14:00:50 crc kubenswrapper[4636]: E1003 14:00:50.792535 4636 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Oct 03 14:00:50 crc kubenswrapper[4636]: W1003 14:00:50.795988 4636 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.24:6443: connect: connection refused
Oct 03 14:00:50 crc kubenswrapper[4636]: E1003 14:00:50.796062 4636 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.24:6443: connect: connection refused" logger="UnhandledError"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.799412 4636 policy_none.go:49] "None policy: Start"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.802387 4636 memory_manager.go:170] "Starting memorymanager" policy="None"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.802418 4636 state_mem.go:35] "Initializing new in-memory state store"
Oct 03 14:00:50 crc kubenswrapper[4636]: E1003 14:00:50.831074 4636 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.847311 4636 manager.go:334] "Starting Device Plugin manager"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.847550 4636 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.847569 4636 server.go:79] "Starting device plugin registration server"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.848185 4636 eviction_manager.go:189] "Eviction manager: starting control loop"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.848203 4636 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.848401 4636 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.848475 4636 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.848484 4636 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Oct 03 14:00:50 crc kubenswrapper[4636]: E1003 14:00:50.856665 4636 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.893276 4636 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.893380 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.895368 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.895449 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.895464 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.895788 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.895966 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.896015 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.897216 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.897236 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.897246 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.897256 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.897281 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.897290 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.898088 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.900227 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.901022 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.903141 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.903176 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.903190 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.903182 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.903234 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.903247 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.903479 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.904409 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.904463 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.904577 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.904633 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.904649 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.904881 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.904990 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.905030 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.905287 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.905319 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.905334 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.905961 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.905992 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.906006 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.906724 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.906822 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.906888 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.907187 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.907283 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.908165 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.908265 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.908351 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:00:50 crc kubenswrapper[4636]: E1003 14:00:50.933120 4636 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.24:6443: connect: connection refused" interval="400ms"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.949138 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.950693 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.950850 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.950930 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.951028 4636 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 03 14:00:50 crc kubenswrapper[4636]: E1003 14:00:50.951820 4636 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.24:6443: connect: connection refused" node="crc"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.978708 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.978772 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.978825 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.978863 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.978887 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.978930 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.978951 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.978972 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.978997 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.979023 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.979053 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.979077 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.979246 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.979327 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 14:00:50 crc kubenswrapper[4636]: I1003 14:00:50.979379 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.080587 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.080647 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.080678 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.080726 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.080741 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.080778 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.080789 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.080749 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.080825 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.080833 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.080844 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.080867 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.080888 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.080913 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.080891 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.080945 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.080963 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.080981 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.080993 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.080997 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.081030 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.081051 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.081072 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.081150 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.081182 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.081195 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.081032 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.081222 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.081238 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.081244 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.152863 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.154783 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.154840 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.154851 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.154875 4636 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: E1003 14:00:51.155357 4636 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.24:6443: connect: connection refused" node="crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.239303 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.254691 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.273258 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: W1003 14:00:51.282058 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-8f20f75fee886eea8e9f3b8930f1347b163939048e5c155df2f72d449ce998df WatchSource:0}: Error finding container 8f20f75fee886eea8e9f3b8930f1347b163939048e5c155df2f72d449ce998df: Status 404 returned error can't find the container with id 8f20f75fee886eea8e9f3b8930f1347b163939048e5c155df2f72d449ce998df
Oct 03 14:00:51 crc kubenswrapper[4636]: W1003 14:00:51.285601 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-d62f6f3f5b6a84b7c5e91bc7fc01950326bcb10d66b02372b394e65e8f371002 WatchSource:0}: Error finding container d62f6f3f5b6a84b7c5e91bc7fc01950326bcb10d66b02372b394e65e8f371002: Status 404 returned error can't find the container with id d62f6f3f5b6a84b7c5e91bc7fc01950326bcb10d66b02372b394e65e8f371002
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.293335 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.298584 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: E1003 14:00:51.334304 4636 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.24:6443: connect: connection refused" interval="800ms"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.555745 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.560307 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.560375 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.560398 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.560452 4636 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: E1003 14:00:51.561369 4636 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.24:6443: connect: connection refused" node="crc"
Oct 03 14:00:51 crc kubenswrapper[4636]: W1003 14:00:51.595304 4636 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.24:6443: connect: connection refused
Oct 03 14:00:51 crc kubenswrapper[4636]: E1003 14:00:51.595435 4636 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.24:6443: connect: connection refused" logger="UnhandledError"
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.728445 4636 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.24:6443: connect: connection refused
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.798118 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"92e896b8347e4a91e242c13115e24ad0cec9257407556a371cbad9137ab31d93"}
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.798869 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bca7c83b5fea52947434a21de3922fd6a5b92b42cf60654ff08d3fa8083a5f76"}
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.799753 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fa6ca459887ce37343bb6fa3ee462563d28fbbe61beb6521a690e6a3170507cb"}
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.801458 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d62f6f3f5b6a84b7c5e91bc7fc01950326bcb10d66b02372b394e65e8f371002"}
Oct 03 14:00:51 crc kubenswrapper[4636]: I1003 14:00:51.803149 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8f20f75fee886eea8e9f3b8930f1347b163939048e5c155df2f72d449ce998df"}
Oct 03 14:00:52 crc kubenswrapper[4636]: W1003 14:00:52.031263 4636 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.24:6443: connect: connection refused
Oct 03 14:00:52 crc kubenswrapper[4636]: E1003 14:00:52.031376 4636 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.24:6443: connect: connection refused" logger="UnhandledError"
Oct 03 14:00:52 crc kubenswrapper[4636]: E1003 14:00:52.135477 4636 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.24:6443: connect: connection refused" interval="1.6s"
Oct 03 14:00:52 crc kubenswrapper[4636]: W1003 14:00:52.213979 4636 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.24:6443: connect: connection refused
Oct 03 14:00:52 crc kubenswrapper[4636]: E1003 14:00:52.214083 4636 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.24:6443: connect: connection refused" logger="UnhandledError"
Oct 03 14:00:52 crc kubenswrapper[4636]: W1003 14:00:52.322631 4636 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.24:6443: connect: connection refused
Oct 03 14:00:52 crc kubenswrapper[4636]: E1003 14:00:52.322707 4636 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.24:6443: connect: connection refused" logger="UnhandledError"
Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.361778 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.363152 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.363279 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.363291 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.363344 4636 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 03 14:00:52 crc kubenswrapper[4636]: E1003 14:00:52.364118 4636 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.24:6443: connect: connection refused" node="crc"
Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.729466 4636 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.24:6443: connect: connection refused
Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.809247 4636 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c" exitCode=0
Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.809445 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c"}
Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.809501 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.811204 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.811249 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:00:52 crc kubenswrapper[4636]:
I1003 14:00:52.811261 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.814122 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f"} Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.814183 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08"} Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.814196 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3"} Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.814182 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.814207 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876"} Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.814351 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.815298 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.815336 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.815346 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.815725 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.815755 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.815766 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.817421 4636 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89" exitCode=0 Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.817504 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89"} Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.817581 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.818874 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.818922 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.818948 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.820743 4636 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="99646a04d7ebb8fe9b23012d70c0a8a05cbcd8c7cbc71cc4e276063575920152" exitCode=0 Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.820791 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"99646a04d7ebb8fe9b23012d70c0a8a05cbcd8c7cbc71cc4e276063575920152"} Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.820905 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.829561 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.830915 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.831287 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.829726 4636 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="5de2406a1c7eb859a4433c77e351aeefe545517d1fe3bf914419b6db29a6a44c" exitCode=0 Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.829758 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"5de2406a1c7eb859a4433c77e351aeefe545517d1fe3bf914419b6db29a6a44c"} Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.829885 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.840117 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.840149 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:52 crc kubenswrapper[4636]: I1003 14:00:52.840162 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.703516 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.728400 4636 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.24:6443: connect: connection refused Oct 03 14:00:53 crc 
kubenswrapper[4636]: E1003 14:00:53.736202 4636 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.24:6443: connect: connection refused" interval="3.2s" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.837409 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"449f08b53e055181f2144302dcff762922e28aaa605b9256e9c0e0d4b2027413"} Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.837547 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.838747 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.838797 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.838814 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.840316 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d9281dd902e3637bd0ecf7f8918a383fe4b3c03afa7f76898c8e0c6a6ed471ba"} Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.840361 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a"} Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.840376 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25"} Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.840388 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5"} Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.840396 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec"} Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.840427 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.841516 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.841622 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.841660 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.843623 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7cf3f403152468f7c3ab024e00ff69e54557a194ac3708c01d993c5f46ff58a1"} Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.843669 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"88b4c344be5f6e3ff9611f5000d87254fda72718ca79eda40aa735ba5f1bd95b"} Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.843685 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4ae09654ffcd8f59fc1ea875ec17a86d9e421644363bb7b761ee5c32d52760fb"} Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.843920 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.845475 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.845517 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.845544 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.845729 4636 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="bad4cdce98deaf19c3e62a0bad33bf255718bfc9f3c0872d9d90332a83c86bba" exitCode=0 Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.845757 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"bad4cdce98deaf19c3e62a0bad33bf255718bfc9f3c0872d9d90332a83c86bba"} Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.845885 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.845894 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.846827 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.846987 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.847072 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.846849 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.847383 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.847453 4636 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 03 14:00:53 crc kubenswrapper[4636]: W1003 14:00:53.872801 4636 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.24:6443: connect: connection refused Oct 03 14:00:53 crc kubenswrapper[4636]: E1003 14:00:53.872897 4636 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.24:6443: connect: connection refused" logger="UnhandledError" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.964564 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.965759 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.965799 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.965812 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:53 crc kubenswrapper[4636]: I1003 14:00:53.965839 4636 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 14:00:53 crc kubenswrapper[4636]: E1003 14:00:53.966347 4636 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.24:6443: connect: connection refused" node="crc" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.106802 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.122428 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:00:54 crc kubenswrapper[4636]: W1003 14:00:54.143159 4636 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.24:6443: connect: connection refused Oct 03 14:00:54 crc kubenswrapper[4636]: E1003 14:00:54.143271 4636 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.24:6443: connect: connection refused" logger="UnhandledError" Oct 03 14:00:54 crc kubenswrapper[4636]: W1003 14:00:54.589432 4636 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.24:6443: connect: connection refused Oct 03 14:00:54 crc kubenswrapper[4636]: E1003 14:00:54.589516 4636 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.24:6443: connect: connection refused" logger="UnhandledError" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.671117 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.671226 4636 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.671272 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.727799 4636 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.24:6443: connect: connection refused Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.850181 4636 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b6d7a189fa3e1baec50c682556ca6f00c21773804bd51f2e34a599f39c801fd8" exitCode=0 Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.850274 4636 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.850284 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.850298 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.850293 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b6d7a189fa3e1baec50c682556ca6f00c21773804bd51f2e34a599f39c801fd8"} Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.850397 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.850415 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.850482 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.851065 4636 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.851119 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.851230 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.851255 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.851265 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.851228 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.851290 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.851303 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.851965 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.851989 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.851965 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.851966 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.852013 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.852001 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.852024 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.852014 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:54 crc kubenswrapper[4636]: I1003 14:00:54.852058 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:55 crc kubenswrapper[4636]: W1003 14:00:55.029813 4636 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.24:6443: connect: connection refused Oct 03 14:00:55 crc kubenswrapper[4636]: E1003 14:00:55.029875 4636 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.24:6443: connect: connection refused" logger="UnhandledError" Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.076838 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.853918 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.855610 4636 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d9281dd902e3637bd0ecf7f8918a383fe4b3c03afa7f76898c8e0c6a6ed471ba" exitCode=255 Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.855660 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d9281dd902e3637bd0ecf7f8918a383fe4b3c03afa7f76898c8e0c6a6ed471ba"} Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.855901 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.857247 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.857271 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.857281 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.857751 4636 scope.go:117] "RemoveContainer" containerID="d9281dd902e3637bd0ecf7f8918a383fe4b3c03afa7f76898c8e0c6a6ed471ba" Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.859453 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c4f38d1718147c948d0c47a36c036bef42c63510f7f88a4e5a64016afcd4c0ca"} Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.859477 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c5e6ba1eff8ce1262a298ccec613cc114e0b4d56cd27980d398de9f85418daa9"} Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.859494 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a0926f422e599406a516b61935ce9563022c3e9e7ad33a9ca950f1c058ced436"} Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.859503 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9c57804a5fe21cefac57519b8f918bf23674bdfd79a85b0ce4656c06d1fe147e"} Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.859511 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"55e0ff2beb74f7fbf7fa9d399206c8f11329cbcf607af0cc6dbeb6ff1a80b473"} Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.859526 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.859558 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.859600 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.860491 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.860504 4636 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.860519 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.860523 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.860528 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.860533 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.860529 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.860564 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:55 crc kubenswrapper[4636]: I1003 14:00:55.860573 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:56 crc kubenswrapper[4636]: I1003 14:00:56.236704 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:00:56 crc kubenswrapper[4636]: I1003 14:00:56.703940 4636 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 14:00:56 crc kubenswrapper[4636]: I1003 14:00:56.703995 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 14:00:56 crc kubenswrapper[4636]: I1003 14:00:56.863126 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 14:00:56 crc kubenswrapper[4636]: I1003 14:00:56.865357 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:56 crc kubenswrapper[4636]: I1003 14:00:56.865860 4636 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 14:00:56 crc kubenswrapper[4636]: I1003 14:00:56.865943 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:56 crc kubenswrapper[4636]: I1003 14:00:56.866272 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf"} Oct 03 14:00:56 crc kubenswrapper[4636]: I1003 14:00:56.866628 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:56 crc 
kubenswrapper[4636]: I1003 14:00:56.866664 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:56 crc kubenswrapper[4636]: I1003 14:00:56.866674 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:56 crc kubenswrapper[4636]: I1003 14:00:56.866903 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:56 crc kubenswrapper[4636]: I1003 14:00:56.866974 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:56 crc kubenswrapper[4636]: I1003 14:00:56.867036 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:57 crc kubenswrapper[4636]: I1003 14:00:57.166504 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:57 crc kubenswrapper[4636]: I1003 14:00:57.167851 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:57 crc kubenswrapper[4636]: I1003 14:00:57.167880 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:57 crc kubenswrapper[4636]: I1003 14:00:57.167889 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:57 crc kubenswrapper[4636]: I1003 14:00:57.167909 4636 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 14:00:57 crc kubenswrapper[4636]: I1003 14:00:57.867445 4636 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 14:00:57 crc kubenswrapper[4636]: I1003 14:00:57.867934 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:57 crc kubenswrapper[4636]: I1003 14:00:57.868655 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:57 crc kubenswrapper[4636]: I1003 14:00:57.868769 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:57 crc kubenswrapper[4636]: I1003 14:00:57.868867 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:57 crc kubenswrapper[4636]: I1003 14:00:57.872879 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 03 14:00:57 crc kubenswrapper[4636]: I1003 14:00:57.873036 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:57 crc kubenswrapper[4636]: I1003 14:00:57.873855 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:57 crc kubenswrapper[4636]: I1003 14:00:57.873979 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:57 crc kubenswrapper[4636]: I1003 14:00:57.874070 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:58 crc kubenswrapper[4636]: I1003 14:00:58.667466 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 03 14:00:58 crc kubenswrapper[4636]: I1003 14:00:58.869379 
4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:58 crc kubenswrapper[4636]: I1003 14:00:58.870395 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:58 crc kubenswrapper[4636]: I1003 14:00:58.870425 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:58 crc kubenswrapper[4636]: I1003 14:00:58.870435 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:00:59 crc kubenswrapper[4636]: I1003 14:00:59.403260 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:00:59 crc kubenswrapper[4636]: I1003 14:00:59.403643 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:00:59 crc kubenswrapper[4636]: I1003 14:00:59.404951 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:00:59 crc kubenswrapper[4636]: I1003 14:00:59.404988 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:00:59 crc kubenswrapper[4636]: I1003 14:00:59.405002 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:00 crc kubenswrapper[4636]: I1003 14:01:00.525248 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:01:00 crc kubenswrapper[4636]: I1003 14:01:00.525434 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:01:00 crc kubenswrapper[4636]: I1003 14:01:00.526520 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:00 crc kubenswrapper[4636]: I1003 14:01:00.526550 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:00 crc kubenswrapper[4636]: I1003 14:01:00.526558 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:00 crc kubenswrapper[4636]: E1003 14:01:00.856786 4636 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 03 14:01:02 crc kubenswrapper[4636]: I1003 14:01:02.638309 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:01:02 crc kubenswrapper[4636]: I1003 14:01:02.638436 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:01:02 crc kubenswrapper[4636]: I1003 14:01:02.639606 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:02 crc kubenswrapper[4636]: I1003 14:01:02.639720 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:02 crc kubenswrapper[4636]: I1003 14:01:02.639790 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:05 crc kubenswrapper[4636]: I1003 14:01:05.729071 4636 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 03 14:01:05 crc kubenswrapper[4636]: I1003 14:01:05.998112 4636 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 03 14:01:05 crc kubenswrapper[4636]: I1003 14:01:05.998178 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 03 14:01:06 crc kubenswrapper[4636]: I1003 14:01:06.004578 4636 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 03 14:01:06 crc kubenswrapper[4636]: I1003 14:01:06.004647 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 03 14:01:06 crc kubenswrapper[4636]: I1003 14:01:06.704984 4636 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 14:01:06 crc kubenswrapper[4636]: I1003 14:01:06.705116 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 14:01:08 crc kubenswrapper[4636]: I1003 14:01:08.049736 4636 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 03 14:01:08 crc kubenswrapper[4636]: I1003 14:01:08.049796 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 03 14:01:08 crc kubenswrapper[4636]: I1003 14:01:08.699216 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 03 14:01:08 crc kubenswrapper[4636]: 
I1003 14:01:08.699407 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:01:08 crc kubenswrapper[4636]: I1003 14:01:08.700509 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:08 crc kubenswrapper[4636]: I1003 14:01:08.700553 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:08 crc kubenswrapper[4636]: I1003 14:01:08.700570 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:08 crc kubenswrapper[4636]: I1003 14:01:08.710656 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 03 14:01:08 crc kubenswrapper[4636]: I1003 14:01:08.890092 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:01:08 crc kubenswrapper[4636]: I1003 14:01:08.890946 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:08 crc kubenswrapper[4636]: I1003 14:01:08.890978 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:08 crc kubenswrapper[4636]: I1003 14:01:08.890989 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:09 crc kubenswrapper[4636]: I1003 14:01:09.403983 4636 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 03 14:01:09 crc kubenswrapper[4636]: I1003 14:01:09.404050 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 03 14:01:09 crc kubenswrapper[4636]: I1003 14:01:09.676559 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:01:09 crc kubenswrapper[4636]: I1003 14:01:09.676892 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:01:09 crc kubenswrapper[4636]: I1003 14:01:09.677258 4636 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 03 14:01:09 crc kubenswrapper[4636]: I1003 14:01:09.677320 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 03 14:01:09 crc kubenswrapper[4636]: I1003 14:01:09.678055 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:09 crc 
kubenswrapper[4636]: I1003 14:01:09.678085 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:09 crc kubenswrapper[4636]: I1003 14:01:09.678094 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:09 crc kubenswrapper[4636]: I1003 14:01:09.680993 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 03 14:01:09 crc kubenswrapper[4636]: I1003 14:01:09.892427 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 14:01:09 crc kubenswrapper[4636]: I1003 14:01:09.892772 4636 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Oct 03 14:01:09 crc kubenswrapper[4636]: I1003 14:01:09.892809 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Oct 03 14:01:09 crc kubenswrapper[4636]: I1003 14:01:09.893412 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:09 crc kubenswrapper[4636]: I1003 14:01:09.893479 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:09 crc kubenswrapper[4636]: I1003 14:01:09.893492 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:10 crc kubenswrapper[4636]: E1003 14:01:10.856878 4636 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Oct 03 14:01:10 crc kubenswrapper[4636]: E1003 14:01:10.980942 4636 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Oct 03 14:01:10 crc kubenswrapper[4636]: I1003 14:01:10.984526 4636 trace.go:236] Trace[555056773]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 14:00:59.909) (total time: 11075ms):
Oct 03 14:01:10 crc kubenswrapper[4636]: Trace[555056773]: ---"Objects listed" error: 11075ms (14:01:10.984)
Oct 03 14:01:10 crc kubenswrapper[4636]: Trace[555056773]: [11.075339493s] [11.075339493s] END
Oct 03 14:01:10 crc kubenswrapper[4636]: I1003 14:01:10.984561 4636 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Oct 03 14:01:10 crc kubenswrapper[4636]: I1003 14:01:10.985467 4636 trace.go:236] Trace[1520044949]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 14:01:00.979) (total time: 10005ms):
Oct 03 14:01:10 crc kubenswrapper[4636]: Trace[1520044949]: ---"Objects listed" error: 10005ms (14:01:10.985)
Oct 03 14:01:10 crc kubenswrapper[4636]: Trace[1520044949]: [10.005687498s] [10.005687498s] END
Oct 03 14:01:10 crc kubenswrapper[4636]: I1003 14:01:10.985490 4636 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Oct 03 14:01:10 crc kubenswrapper[4636]: E1003 14:01:10.986386 4636 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Oct 03 14:01:10 crc kubenswrapper[4636]: I1003 14:01:10.987466 4636 trace.go:236] Trace[2000990130]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 14:01:00.064) (total time: 10922ms):
Oct 03 14:01:10 crc kubenswrapper[4636]: Trace[2000990130]: ---"Objects listed" error: 10922ms (14:01:10.987)
Oct 03 14:01:10 crc kubenswrapper[4636]: Trace[2000990130]: [10.922867301s] [10.922867301s] END
Oct 03 14:01:10 crc kubenswrapper[4636]: I1003 14:01:10.987492 4636 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Oct 03 14:01:10 crc kubenswrapper[4636]: I1003 14:01:10.987840 4636 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Oct 03 14:01:10 crc kubenswrapper[4636]: I1003 14:01:10.988837 4636 trace.go:236] Trace[198325695]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 14:00:57.437) (total time: 13551ms):
Oct 03 14:01:10 crc kubenswrapper[4636]: Trace[198325695]: ---"Objects listed" error: 13550ms (14:01:10.988)
Oct 03 14:01:10 crc kubenswrapper[4636]: Trace[198325695]: [13.5512902s] [13.5512902s] END
Oct 03 14:01:10 crc kubenswrapper[4636]: I1003 14:01:10.988861 4636 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.726203 4636 apiserver.go:52] "Watching apiserver"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.729278 4636 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.729594 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.729965 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.729965 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 14:01:11 crc kubenswrapper[4636]: E1003 14:01:11.730214 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.730294 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 14:01:11 crc kubenswrapper[4636]: E1003 14:01:11.730336 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.730347 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.730386 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.730386 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 03 14:01:11 crc kubenswrapper[4636]: E1003 14:01:11.730389 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.732811 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.732826 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.733106 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.733118 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.733105 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.733714 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.733843 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.733935 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.734035 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.765666 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.777768 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.787448 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.799417 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.810007 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.822496 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.829290 4636 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.832425 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.891982 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892032 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892058 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892081 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892135 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892158 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892183 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892204 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892229 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892251 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892273 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892296 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892318 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892341 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892362 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892388 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892412 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892436 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892457 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892481 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892504 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892529 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892358 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892558 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892373 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892532 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892535 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892736 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892751 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892781 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892915 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892927 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893001 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893014 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893050 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.892550 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893146 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893178 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893203 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893228 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893252 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893275 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893300 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893328 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893379 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893400 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893421 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893443 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893462 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893481 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893502 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893522 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893626 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893655 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893678 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893700 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893722 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893742 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893760 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893778 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893837 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893858 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893899 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893922 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893964 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893987 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894009 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893202 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893359 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893489 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893595 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894055 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893767 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893834 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.893865 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894007 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894026 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894189 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894221 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894251 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894031 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894295 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894319 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894366 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894391 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894395 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894413 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894435 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894458 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894466 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894481 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894504 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894514 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894527 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894529 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894554 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894552 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894584 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894611 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894653 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894657 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894705 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894710 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894749 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894776 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894807 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894829 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894855 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894881 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894906 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894930 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894956 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894979 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.895002 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.895023 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.895044 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.895067 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.895089 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894779 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894797 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894824 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894946 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.894970 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.895160 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: E1003 14:01:11.895246 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:01:12.395224878 +0000 UTC m=+22.253951125 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.898669 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.898706 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.898727 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.898749 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.898769 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.898793 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.898813 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.898833 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.898854 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.898877 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.898896 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.898921 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.898950 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.898971 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.898991 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.899011 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.899043 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.899081 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.899128 4636 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.899149 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.899174 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.899227 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.899255 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.899270 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.899287 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.899310 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.899331 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.899349 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 
14:01:11.899370 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.899389 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.899408 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.899429 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.899451 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.899471 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.900857 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.900930 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.901001 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.901071 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 14:01:11 crc kubenswrapper[4636]: 
I1003 14:01:11.901183 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.901258 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.901337 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.901605 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.901680 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.901884 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.902000 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.902138 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.902742 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.902795 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.902831 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.902871 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.902908 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.902936 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.902971 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.902998 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903026 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903055 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903084 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903133 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903165 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903190 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903218 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903248 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903275 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903326 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903353 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903386 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903413 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903437 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903465 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903489 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903508 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903531 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903555 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903577 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903602 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903627 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903649 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903671 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " 
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903696 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903717 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903739 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903761 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903780 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903801 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903825 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903850 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903872 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903892 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903917 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903940 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903966 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903989 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.904026 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.904053 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.904256 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.904283 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.906222 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.906625 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.907186 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.907280 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.907366 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.908802 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.908922 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.909003 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.909072 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.909183 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.909269 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.909348 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.909418 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.909493 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.909571 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.909702 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.909807 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.909924 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.910012 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.910118 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.910216 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.910303 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.910432 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.910518 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.910605 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.910676 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.910772 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.910858 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.910942 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.911095 4636 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.911193 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.911275 4636 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.911337 4636 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.911410 4636 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.911470 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912134 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912177 4636 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912202 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912220 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912237 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912259 4636 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912277 4636 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912291 4636 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912314 4636 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912330 4636 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912365 4636 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912382 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912398 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912415 4636 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912431 4636 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912445 4636 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912462 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912478 4636 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912492 4636 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912506 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912524 4636
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912539 4636 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912553 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912568 4636 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912584 4636 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912597 4636 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912612 4636 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912624 4636 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912637 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912651 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912665 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912679 4636 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912694 4636 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 
14:01:11.912711 4636 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.900952 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.901242 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.895404 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.895436 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.895620 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.896166 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.896273 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.896863 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.897144 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.897235 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.897352 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.897300 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.897769 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.897983 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.898004 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.895243 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.901448 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.901472 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.901681 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.901946 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.902223 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.902374 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.902470 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.902526 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.902558 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.902820 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903054 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903270 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.903567 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.904041 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.904368 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.904516 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.904688 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.904777 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.904790 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.904886 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.905073 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.905145 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.905468 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.905531 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.905609 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.905652 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.906004 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.906211 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.906348 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.906383 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.906383 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.906390 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.906438 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.906464 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.906687 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.907201 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.907212 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.907298 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.907502 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.907602 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.915452 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.907604 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.907727 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). 
InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.907891 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.907943 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.908052 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.908461 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.908493 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.908640 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.908860 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.908885 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). 
InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.908915 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.909557 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.909714 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.909985 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.910036 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.910276 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.910353 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.910567 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.910572 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.910609 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.910634 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.910771 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.910922 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.910970 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.911074 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). 
InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.911188 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.911205 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.911300 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.911506 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.911665 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.911708 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912008 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912450 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.912762 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.913009 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.913152 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.916077 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.916307 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.916352 4636 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.916447 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.916659 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.916809 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.916914 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.916940 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.917068 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.917181 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.917427 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.917727 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.917979 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.917996 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.918208 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.918455 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.918512 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.918702 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.918829 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.918927 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.918955 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.916253 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.919050 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.919128 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.919427 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.919427 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.919453 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.919493 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.919511 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.919552 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.919655 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.919806 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.919809 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.919968 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.920139 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.920309 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.920362 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.920571 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.921508 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.921529 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: E1003 14:01:11.922411 4636 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:01:11 crc kubenswrapper[4636]: E1003 14:01:11.922499 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:12.422475737 +0000 UTC m=+22.281202204 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.925237 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: E1003 14:01:11.925350 4636 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:01:11 crc kubenswrapper[4636]: E1003 14:01:11.925472 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:12.425442172 +0000 UTC m=+22.284168619 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.926727 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.926744 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.928299 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.935341 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.936013 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.938624 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.939473 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.940747 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.940872 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.940940 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.941173 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.941332 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.941364 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: E1003 14:01:11.945228 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:01:11 crc kubenswrapper[4636]: E1003 14:01:11.945253 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:01:11 crc kubenswrapper[4636]: E1003 14:01:11.945270 4636 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:11 crc kubenswrapper[4636]: E1003 14:01:11.945342 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:12.445319954 +0000 UTC m=+22.304046201 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.947812 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.949226 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.949238 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.949497 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.949591 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.952350 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.952587 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.952850 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.953282 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.955540 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.966815 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 14:01:11 crc kubenswrapper[4636]: E1003 14:01:11.967270 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:01:11 crc kubenswrapper[4636]: E1003 14:01:11.967310 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:01:11 crc kubenswrapper[4636]: E1003 14:01:11.967328 4636 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:11 crc kubenswrapper[4636]: E1003 14:01:11.967400 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:12.467374831 +0000 UTC m=+22.326101268 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.967701 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.968597 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.975789 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.976221 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:01:11 crc kubenswrapper[4636]: I1003 14:01:11.982117 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014470 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014544 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014590 4636 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014601 4636 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014610 4636 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014619 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014627 4636 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014635 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014643 4636 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014651 4636 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014660 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014670 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" 
DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014678 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014687 4636 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014695 4636 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014703 4636 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014710 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014718 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014726 4636 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014735 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014743 4636 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014750 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014758 4636 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014765 4636 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014773 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 
14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014780 4636 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014771 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014821 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014788 4636 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014857 4636 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014872 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014885 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014896 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014908 4636 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014922 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014936 4636 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014948 4636 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014960 4636 reconciler_common.go:293] "Volume detached for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014974 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014985 4636 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.014997 4636 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015008 4636 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015019 4636 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015033 4636 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015045 4636 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015056 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015068 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015080 4636 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015092 4636 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015125 4636 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015136 4636 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015147 4636 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015158 4636 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015169 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015180 4636 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015191 4636 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015201 4636 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015212 4636 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015223 4636 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015236 4636 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015247 4636 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015258 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015268 4636 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015279 4636 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 03 
14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015292 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015303 4636 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015314 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015326 4636 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015338 4636 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015350 4636 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015364 4636 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015377 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015388 4636 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015400 4636 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015413 4636 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015423 4636 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015434 4636 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") 
on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015447 4636 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015459 4636 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015472 4636 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015483 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015495 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015506 4636 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015519 4636 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015532 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015544 4636 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015571 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015590 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015602 4636 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015613 4636 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015625 4636 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015638 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015650 4636 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015661 4636 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015673 4636 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015684 4636 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015696 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015709 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015721 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015732 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015745 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015756 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015768 4636 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015778 4636 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015790 4636 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015801 4636 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015813 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015823 4636 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015835 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015846 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015857 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015870 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015882 4636 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015893 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015905 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015918 4636 reconciler_common.go:293] "Volume detached for volume 
\"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015930 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015941 4636 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015953 4636 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015965 4636 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015976 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015987 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.015998 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016009 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016021 4636 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016032 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016044 4636 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016056 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 03 
14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016068 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016082 4636 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016093 4636 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016125 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016136 4636 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016147 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016158 4636 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016169 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016180 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016192 4636 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016203 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016214 4636 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016225 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 
14:01:12.016236 4636 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016247 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016259 4636 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016271 4636 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016282 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016295 4636 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016305 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016318 4636 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016329 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016340 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016356 4636 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016367 4636 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016379 4636 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016390 4636 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016403 4636 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016415 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.016426 4636 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.044843 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.054855 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.065595 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 14:01:12 crc kubenswrapper[4636]: W1003 14:01:12.074433 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-43db42918db572342331a8fcea2bded13610ee5f00c3da6f195ea8eba1a7c626 WatchSource:0}: Error finding container 43db42918db572342331a8fcea2bded13610ee5f00c3da6f195ea8eba1a7c626: Status 404 returned error can't find the container with id 43db42918db572342331a8fcea2bded13610ee5f00c3da6f195ea8eba1a7c626 Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.421123 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:01:12 crc kubenswrapper[4636]: E1003 14:01:12.421329 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:01:13.42130493 +0000 UTC m=+23.280031177 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.521705 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.521748 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.521784 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.521815 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:12 crc kubenswrapper[4636]: E1003 14:01:12.521903 4636 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:01:12 crc kubenswrapper[4636]: E1003 14:01:12.521937 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:01:12 crc kubenswrapper[4636]: E1003 14:01:12.521966 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:13.521951403 +0000 UTC m=+23.380677650 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:01:12 crc kubenswrapper[4636]: E1003 14:01:12.521908 4636 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:01:12 crc kubenswrapper[4636]: E1003 14:01:12.521982 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:01:12 crc kubenswrapper[4636]: E1003 14:01:12.521990 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:01:12 crc kubenswrapper[4636]: E1003 14:01:12.522015 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:01:12 crc kubenswrapper[4636]: E1003 14:01:12.522029 4636 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:12 crc kubenswrapper[4636]: E1003 14:01:12.522001 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:13.521993734 +0000 UTC m=+23.380719981 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:01:12 crc kubenswrapper[4636]: E1003 14:01:12.521997 4636 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:12 crc kubenswrapper[4636]: E1003 14:01:12.522146 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:13.522081076 +0000 UTC m=+23.380807493 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:12 crc kubenswrapper[4636]: E1003 14:01:12.522175 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:13.522164798 +0000 UTC m=+23.380891265 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.756377 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-xf7xs"] Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.756743 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xf7xs" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.768107 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.770361 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.772484 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.777125 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.777239 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-r9xm2"] Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.777978 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-r9xm2" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.780186 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.780510 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.780811 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.789202 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.797435 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.798136 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.810628 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.820603 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.830111 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.839838 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.845843 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.853895 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.864271 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.875150 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.883217 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.892432 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.902996 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.903779 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.904223 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.905768 4636 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf" exitCode=255 Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.914175 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.924423 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/61e13aef-fd75-4e3e-a84d-44093600f786-hosts-file\") pod \"node-resolver-r9xm2\" (UID: \"61e13aef-fd75-4e3e-a84d-44093600f786\") " pod="openshift-dns/node-resolver-r9xm2" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.924462 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbp4f\" (UniqueName: \"kubernetes.io/projected/686acf3e-9445-4e3f-9a49-d714556a8e52-kube-api-access-fbp4f\") pod \"node-ca-xf7xs\" (UID: \"686acf3e-9445-4e3f-9a49-d714556a8e52\") " pod="openshift-image-registry/node-ca-xf7xs" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.924484 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2ksq\" (UniqueName: \"kubernetes.io/projected/61e13aef-fd75-4e3e-a84d-44093600f786-kube-api-access-w2ksq\") pod \"node-resolver-r9xm2\" (UID: \"61e13aef-fd75-4e3e-a84d-44093600f786\") " pod="openshift-dns/node-resolver-r9xm2" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.924523 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/686acf3e-9445-4e3f-9a49-d714556a8e52-host\") pod \"node-ca-xf7xs\" (UID: \"686acf3e-9445-4e3f-9a49-d714556a8e52\") " pod="openshift-image-registry/node-ca-xf7xs" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.924576 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/686acf3e-9445-4e3f-9a49-d714556a8e52-serviceca\") pod \"node-ca-xf7xs\" (UID: \"686acf3e-9445-4e3f-9a49-d714556a8e52\") " pod="openshift-image-registry/node-ca-xf7xs" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.926548 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:12 crc kubenswrapper[4636]: I1003 14:01:12.947114 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.025988 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/61e13aef-fd75-4e3e-a84d-44093600f786-hosts-file\") pod \"node-resolver-r9xm2\" (UID: \"61e13aef-fd75-4e3e-a84d-44093600f786\") " pod="openshift-dns/node-resolver-r9xm2" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.026078 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2ksq\" (UniqueName: \"kubernetes.io/projected/61e13aef-fd75-4e3e-a84d-44093600f786-kube-api-access-w2ksq\") pod \"node-resolver-r9xm2\" (UID: \"61e13aef-fd75-4e3e-a84d-44093600f786\") " pod="openshift-dns/node-resolver-r9xm2" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.026200 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbp4f\" (UniqueName: \"kubernetes.io/projected/686acf3e-9445-4e3f-9a49-d714556a8e52-kube-api-access-fbp4f\") pod \"node-ca-xf7xs\" (UID: \"686acf3e-9445-4e3f-9a49-d714556a8e52\") " pod="openshift-image-registry/node-ca-xf7xs" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.026301 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/686acf3e-9445-4e3f-9a49-d714556a8e52-host\") pod \"node-ca-xf7xs\" (UID: \"686acf3e-9445-4e3f-9a49-d714556a8e52\") " pod="openshift-image-registry/node-ca-xf7xs" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.026330 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/61e13aef-fd75-4e3e-a84d-44093600f786-hosts-file\") pod \"node-resolver-r9xm2\" (UID: \"61e13aef-fd75-4e3e-a84d-44093600f786\") " pod="openshift-dns/node-resolver-r9xm2" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.026351 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/686acf3e-9445-4e3f-9a49-d714556a8e52-serviceca\") pod \"node-ca-xf7xs\" (UID: \"686acf3e-9445-4e3f-9a49-d714556a8e52\") " pod="openshift-image-registry/node-ca-xf7xs" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 
14:01:13.026428 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/686acf3e-9445-4e3f-9a49-d714556a8e52-host\") pod \"node-ca-xf7xs\" (UID: \"686acf3e-9445-4e3f-9a49-d714556a8e52\") " pod="openshift-image-registry/node-ca-xf7xs" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.049541 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbp4f\" (UniqueName: \"kubernetes.io/projected/686acf3e-9445-4e3f-9a49-d714556a8e52-kube-api-access-fbp4f\") pod \"node-ca-xf7xs\" (UID: \"686acf3e-9445-4e3f-9a49-d714556a8e52\") " pod="openshift-image-registry/node-ca-xf7xs" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.192969 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.193631 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.194231 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.194736 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.195378 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.195948 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.196535 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.197010 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.197535 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.198205 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.198721 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.199281 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.199858 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.200380 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.200938 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.201379 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/686acf3e-9445-4e3f-9a49-d714556a8e52-serviceca\") pod \"node-ca-xf7xs\" (UID: \"686acf3e-9445-4e3f-9a49-d714556a8e52\") " pod="openshift-image-registry/node-ca-xf7xs" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.201398 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.202039 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.205244 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.205716 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.206277 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.206688 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.207337 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.207778 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.208413 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.209055 4636 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.209600 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.210204 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.210713 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.213192 4636 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.213302 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.215128 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.216019 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.216532 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.218158 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.218160 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2ksq\" (UniqueName: \"kubernetes.io/projected/61e13aef-fd75-4e3e-a84d-44093600f786-kube-api-access-w2ksq\") pod \"node-resolver-r9xm2\" (UID: \"61e13aef-fd75-4e3e-a84d-44093600f786\") " pod="openshift-dns/node-resolver-r9xm2" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.218980 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.220390 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.221180 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.222244 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.222755 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.223713 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.224837 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.225498 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.226393 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.226915 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.227815 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.228596 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.229474 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.230003 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.230642 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.231625 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.232233 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.233522 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.234521 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c"} Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.234562 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-ltsq6"] Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.234800 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-lbt25"] Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.235141 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.235465 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ce109b66ff30ed4f5b6f1fda865754404e5fd0bf7333cbb6580ad3b38a079d0a"} Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.235486 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-ngmch"] Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.235649 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.235676 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.235677 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.235712 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:13 crc kubenswrapper[4636]: E1003 14:01:13.235822 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:01:13 crc kubenswrapper[4636]: E1003 14:01:13.236080 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:01:13 crc kubenswrapper[4636]: E1003 14:01:13.236326 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.236592 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf"} Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.236614 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"1e423ca9aac2748d56a0c4956615ebca2523d53fa1d5a309aedbaa53fbc8e987"} Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.236690 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.236624 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6"} Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.236745 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"43db42918db572342331a8fcea2bded13610ee5f00c3da6f195ea8eba1a7c626"} Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.236762 4636 scope.go:117] "RemoveContainer" containerID="d9281dd902e3637bd0ecf7f8918a383fe4b3c03afa7f76898c8e0c6a6ed471ba" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.237474 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.237771 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.238411 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.238633 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.242326 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.242774 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.242899 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 
03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.243557 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.244469 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.250900 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.250954 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.251328 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.254694 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.268891 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.282153 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.295270 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.306201 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.313929 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.325982 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330213 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-cnibin\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330264 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-host-var-lib-cni-bin\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330284 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" 
(UniqueName: \"kubernetes.io/configmap/c2470111-1b59-4048-89ff-2b7e83659200-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lbt25\" (UID: \"c2470111-1b59-4048-89ff-2b7e83659200\") " pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330319 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmm8v\" (UniqueName: \"kubernetes.io/projected/f078d6dd-d81e-4a06-aca1-508bf23a2170-kube-api-access-fmm8v\") pod \"machine-config-daemon-ngmch\" (UID: \"f078d6dd-d81e-4a06-aca1-508bf23a2170\") " pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330336 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svdtt\" (UniqueName: \"kubernetes.io/projected/c2470111-1b59-4048-89ff-2b7e83659200-kube-api-access-svdtt\") pod \"multus-additional-cni-plugins-lbt25\" (UID: \"c2470111-1b59-4048-89ff-2b7e83659200\") " pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330351 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-multus-socket-dir-parent\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330439 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-system-cni-dir\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330534 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-multus-cni-dir\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330581 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-hostroot\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330611 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-multus-conf-dir\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330633 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-etc-kubernetes\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330676 4636 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c2470111-1b59-4048-89ff-2b7e83659200-cni-binary-copy\") pod \"multus-additional-cni-plugins-lbt25\" (UID: \"c2470111-1b59-4048-89ff-2b7e83659200\") " pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330708 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-os-release\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330724 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-host-run-multus-certs\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330739 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-host-var-lib-cni-multus\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330756 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c2470111-1b59-4048-89ff-2b7e83659200-os-release\") pod \"multus-additional-cni-plugins-lbt25\" (UID: \"c2470111-1b59-4048-89ff-2b7e83659200\") " pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330777 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c2470111-1b59-4048-89ff-2b7e83659200-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lbt25\" (UID: \"c2470111-1b59-4048-89ff-2b7e83659200\") " pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330792 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-host-var-lib-kubelet\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330807 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f078d6dd-d81e-4a06-aca1-508bf23a2170-rootfs\") pod \"machine-config-daemon-ngmch\" (UID: \"f078d6dd-d81e-4a06-aca1-508bf23a2170\") " pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330826 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-host-run-k8s-cni-cncf-io\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " 
pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330885 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-host-run-netns\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330914 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c2470111-1b59-4048-89ff-2b7e83659200-system-cni-dir\") pod \"multus-additional-cni-plugins-lbt25\" (UID: \"c2470111-1b59-4048-89ff-2b7e83659200\") " pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330943 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/140a698f-2661-4dc8-86d9-929b0d6dd326-multus-daemon-config\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330965 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c2470111-1b59-4048-89ff-2b7e83659200-cnibin\") pod \"multus-additional-cni-plugins-lbt25\" (UID: \"c2470111-1b59-4048-89ff-2b7e83659200\") " pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.330994 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f078d6dd-d81e-4a06-aca1-508bf23a2170-mcd-auth-proxy-config\") pod \"machine-config-daemon-ngmch\" (UID: \"f078d6dd-d81e-4a06-aca1-508bf23a2170\") " pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.331017 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/140a698f-2661-4dc8-86d9-929b0d6dd326-cni-binary-copy\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.331038 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5fmd\" (UniqueName: \"kubernetes.io/projected/140a698f-2661-4dc8-86d9-929b0d6dd326-kube-api-access-n5fmd\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.331076 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f078d6dd-d81e-4a06-aca1-508bf23a2170-proxy-tls\") pod \"machine-config-daemon-ngmch\" (UID: \"f078d6dd-d81e-4a06-aca1-508bf23a2170\") " pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.335698 4636 scope.go:117] "RemoveContainer" containerID="372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf" Oct 03 14:01:13 crc kubenswrapper[4636]: E1003 14:01:13.335918 4636 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.341321 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.345617 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.367051 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xf7xs" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.383753 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.
168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.388882 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-r9xm2" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432281 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432353 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-system-cni-dir\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432375 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-multus-cni-dir\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432391 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-hostroot\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432405 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-multus-conf-dir\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: E1003 14:01:13.432440 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:01:15.432411636 +0000 UTC m=+25.291137883 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432450 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-multus-conf-dir\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432487 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-etc-kubernetes\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432513 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-hostroot\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432582 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-host-run-multus-certs\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432579 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-etc-kubernetes\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432630 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-multus-cni-dir\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432639 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-system-cni-dir\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432647 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c2470111-1b59-4048-89ff-2b7e83659200-cni-binary-copy\") pod \"multus-additional-cni-plugins-lbt25\" (UID: \"c2470111-1b59-4048-89ff-2b7e83659200\") " pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432720 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-os-release\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432740 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-host-var-lib-cni-multus\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432758 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c2470111-1b59-4048-89ff-2b7e83659200-os-release\") pod \"multus-additional-cni-plugins-lbt25\" (UID: \"c2470111-1b59-4048-89ff-2b7e83659200\") " pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432612 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-host-run-multus-certs\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432775 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c2470111-1b59-4048-89ff-2b7e83659200-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lbt25\" (UID: \"c2470111-1b59-4048-89ff-2b7e83659200\") " pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432792 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-host-run-k8s-cni-cncf-io\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432803 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-host-var-lib-cni-multus\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432806 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-host-var-lib-kubelet\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432825 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-host-var-lib-kubelet\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432833 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f078d6dd-d81e-4a06-aca1-508bf23a2170-rootfs\") pod \"machine-config-daemon-ngmch\" (UID: 
\"f078d6dd-d81e-4a06-aca1-508bf23a2170\") " pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432862 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-host-run-netns\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432877 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c2470111-1b59-4048-89ff-2b7e83659200-system-cni-dir\") pod \"multus-additional-cni-plugins-lbt25\" (UID: \"c2470111-1b59-4048-89ff-2b7e83659200\") " pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432897 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c2470111-1b59-4048-89ff-2b7e83659200-cnibin\") pod \"multus-additional-cni-plugins-lbt25\" (UID: \"c2470111-1b59-4048-89ff-2b7e83659200\") " pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432922 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/140a698f-2661-4dc8-86d9-929b0d6dd326-multus-daemon-config\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432944 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/140a698f-2661-4dc8-86d9-929b0d6dd326-cni-binary-copy\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432965 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f078d6dd-d81e-4a06-aca1-508bf23a2170-mcd-auth-proxy-config\") pod \"machine-config-daemon-ngmch\" (UID: \"f078d6dd-d81e-4a06-aca1-508bf23a2170\") " pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432978 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c2470111-1b59-4048-89ff-2b7e83659200-os-release\") pod \"multus-additional-cni-plugins-lbt25\" (UID: \"c2470111-1b59-4048-89ff-2b7e83659200\") " pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432981 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5fmd\" (UniqueName: \"kubernetes.io/projected/140a698f-2661-4dc8-86d9-929b0d6dd326-kube-api-access-n5fmd\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.432999 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f078d6dd-d81e-4a06-aca1-508bf23a2170-proxy-tls\") pod \"machine-config-daemon-ngmch\" (UID: \"f078d6dd-d81e-4a06-aca1-508bf23a2170\") " 
pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.433015 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-cnibin\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.433015 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-os-release\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.433029 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-host-var-lib-cni-bin\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.433045 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c2470111-1b59-4048-89ff-2b7e83659200-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lbt25\" (UID: \"c2470111-1b59-4048-89ff-2b7e83659200\") " pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.433065 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmm8v\" (UniqueName: \"kubernetes.io/projected/f078d6dd-d81e-4a06-aca1-508bf23a2170-kube-api-access-fmm8v\") pod \"machine-config-daemon-ngmch\" (UID: \"f078d6dd-d81e-4a06-aca1-508bf23a2170\") " pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.433090 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svdtt\" (UniqueName: \"kubernetes.io/projected/c2470111-1b59-4048-89ff-2b7e83659200-kube-api-access-svdtt\") pod \"multus-additional-cni-plugins-lbt25\" (UID: \"c2470111-1b59-4048-89ff-2b7e83659200\") " pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.433133 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-multus-socket-dir-parent\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.433160 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-host-var-lib-cni-bin\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.433188 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-multus-socket-dir-parent\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 
crc kubenswrapper[4636]: I1003 14:01:13.433205 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f078d6dd-d81e-4a06-aca1-508bf23a2170-rootfs\") pod \"machine-config-daemon-ngmch\" (UID: \"f078d6dd-d81e-4a06-aca1-508bf23a2170\") " pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.433065 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-host-run-k8s-cni-cncf-io\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.433255 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-host-run-netns\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.433368 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/140a698f-2661-4dc8-86d9-929b0d6dd326-cnibin\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.433569 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c2470111-1b59-4048-89ff-2b7e83659200-cnibin\") pod \"multus-additional-cni-plugins-lbt25\" (UID: \"c2470111-1b59-4048-89ff-2b7e83659200\") " pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.433193 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c2470111-1b59-4048-89ff-2b7e83659200-system-cni-dir\") pod \"multus-additional-cni-plugins-lbt25\" (UID: \"c2470111-1b59-4048-89ff-2b7e83659200\") " pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.533769 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.533813 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.533842 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:13 
crc kubenswrapper[4636]: I1003 14:01:13.533896 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:13 crc kubenswrapper[4636]: E1003 14:01:13.533957 4636 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:01:13 crc kubenswrapper[4636]: E1003 14:01:13.533990 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:01:13 crc kubenswrapper[4636]: E1003 14:01:13.534002 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:01:13 crc kubenswrapper[4636]: E1003 14:01:13.534012 4636 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:13 crc kubenswrapper[4636]: E1003 14:01:13.534022 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:15.534005893 +0000 UTC m=+25.392732140 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:01:13 crc kubenswrapper[4636]: E1003 14:01:13.534017 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:01:13 crc kubenswrapper[4636]: E1003 14:01:13.534057 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:01:13 crc kubenswrapper[4636]: E1003 14:01:13.534071 4636 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:13 crc kubenswrapper[4636]: E1003 14:01:13.534038 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:15.534029714 +0000 UTC m=+25.392755961 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:13 crc kubenswrapper[4636]: E1003 14:01:13.534154 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:15.534128356 +0000 UTC m=+25.392854793 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:13 crc kubenswrapper[4636]: E1003 14:01:13.534394 4636 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:01:13 crc kubenswrapper[4636]: E1003 14:01:13.534442 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:15.534430664 +0000 UTC m=+25.393156911 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.670191 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.671526 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t7xd5"] Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.676146 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.679467 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.680412 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.680693 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.680962 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.680999 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.683319 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.684026 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.693852 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.707022 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.713975 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.723779 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.738024 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.739249 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.755271 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.772089 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9281dd902e3637bd0ecf7f8918a383fe4b3c03afa7f76898c8e0c6a6ed471ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:00:55Z\\\",\\\"message\\\":\\\"W1003 14:00:54.079829 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 
14:00:54.080140 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759500054 cert, and key in /tmp/serving-cert-190172997/serving-signer.crt, /tmp/serving-cert-190172997/serving-signer.key\\\\nI1003 14:00:54.601697 1 observer_polling.go:159] Starting file observer\\\\nW1003 14:00:54.604188 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 14:00:54.604339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:00:54.605027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-190172997/tls.crt::/tmp/serving-cert-190172997/tls.key\\\\\\\"\\\\nF1003 14:00:55.021395 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 
secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.784366 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.796152 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.809214 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.819312 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.827887 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.835966 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.837217 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-run-ovn-kubernetes\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.837243 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-cni-netd\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.837269 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-slash\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.837355 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-run-ovn\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.837416 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-kubelet\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.837440 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p9qj\" (UniqueName: \"kubernetes.io/projected/564529e3-ff40-4923-9f6d-319a9b41720a-kube-api-access-2p9qj\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.837482 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-node-log\") pod \"ovnkube-node-t7xd5\" 
(UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.837531 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-run-netns\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.837560 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/564529e3-ff40-4923-9f6d-319a9b41720a-ovnkube-script-lib\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.837596 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-cni-bin\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.837657 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-run-systemd\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.837683 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/564529e3-ff40-4923-9f6d-319a9b41720a-env-overrides\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.837718 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-etc-openvswitch\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.837750 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-run-openvswitch\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.837772 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-log-socket\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.837807 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-var-lib-openvswitch\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.837836 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/564529e3-ff40-4923-9f6d-319a9b41720a-ovn-node-metrics-cert\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.837855 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-systemd-units\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.837889 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.837914 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/564529e3-ff40-4923-9f6d-319a9b41720a-ovnkube-config\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.851573 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.859992 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.870391 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 
14:01:13.879021 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.889652 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.897376 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.906510 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.911536 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xf7xs" event={"ID":"686acf3e-9445-4e3f-9a49-d714556a8e52","Type":"ContainerStarted","Data":"6457fd6e5ab181dd2e0023bacd2274d51dac5a09a3fec79b7013f7c5f75be3f3"} Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.912997 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4"} Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.913799 4636 scope.go:117] "RemoveContainer" containerID="372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf" Oct 03 14:01:13 crc kubenswrapper[4636]: E1003 14:01:13.913950 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 
14:01:13.915577 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.922915 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.929344 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938491 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-systemd-units\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938553 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938580 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/564529e3-ff40-4923-9f6d-319a9b41720a-ovnkube-config\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938606 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-run-ovn-kubernetes\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938624 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-cni-netd\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938640 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-slash\") pod 
\"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938635 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-systemd-units\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938659 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-run-ovn\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938718 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-kubelet\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938723 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-cni-netd\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938742 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p9qj\" (UniqueName: \"kubernetes.io/projected/564529e3-ff40-4923-9f6d-319a9b41720a-kube-api-access-2p9qj\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938751 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-run-ovn-kubernetes\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938778 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-node-log\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938795 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-kubelet\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938823 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938827 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-run-netns\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938849 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-node-log\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938852 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/564529e3-ff40-4923-9f6d-319a9b41720a-ovnkube-script-lib\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938872 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-run-netns\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938880 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-cni-bin\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938938 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-run-systemd\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938966 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/564529e3-ff40-4923-9f6d-319a9b41720a-env-overrides\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.938997 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-run-ovn\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.939007 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-cni-bin\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.939114 4636 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-run-systemd\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.939144 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-etc-openvswitch\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.939176 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-run-openvswitch\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.939170 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-etc-openvswitch\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.939197 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-log-socket\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.939220 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-run-openvswitch\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.939228 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-var-lib-openvswitch\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.939133 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-slash\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.939263 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-var-lib-openvswitch\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.939266 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/564529e3-ff40-4923-9f6d-319a9b41720a-ovn-node-metrics-cert\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.939260 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-log-socket\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.940722 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9281dd902e3637bd0ecf7f8918a383fe4b3c03afa7f76898c8e0c6a6ed471ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:00:55Z\\\",\\\"message\\\":\\\"W1003 14:00:54.079829 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 14:00:54.080140 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759500054 cert, and key in /tmp/serving-cert-190172997/serving-signer.crt, /tmp/serving-cert-190172997/serving-signer.key\\\\nI1003 14:00:54.601697 1 observer_polling.go:159] Starting file observer\\\\nW1003 14:00:54.604188 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 14:00:54.604339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 14:00:54.605027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-190172997/tls.crt::/tmp/serving-cert-190172997/tls.key\\\\\\\"\\\\nF1003 14:00:55.021395 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] 
\\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.957083 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 
03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.968228 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.980362 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:13 crc kubenswrapper[4636]: I1003 14:01:13.991513 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.003191 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.011621 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.027059 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.038015 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.049339 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.056455 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.066169 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.076870 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.088645 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 
03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.097924 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs
\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.107468 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.121390 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.196471 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c2470111-1b59-4048-89ff-2b7e83659200-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lbt25\" (UID: \"c2470111-1b59-4048-89ff-2b7e83659200\") " pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.262576 4636 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/140a698f-2661-4dc8-86d9-929b0d6dd326-multus-daemon-config\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.262698 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/140a698f-2661-4dc8-86d9-929b0d6dd326-cni-binary-copy\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.264190 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmm8v\" (UniqueName: \"kubernetes.io/projected/f078d6dd-d81e-4a06-aca1-508bf23a2170-kube-api-access-fmm8v\") pod \"machine-config-daemon-ngmch\" (UID: \"f078d6dd-d81e-4a06-aca1-508bf23a2170\") " pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.264556 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svdtt\" (UniqueName: \"kubernetes.io/projected/c2470111-1b59-4048-89ff-2b7e83659200-kube-api-access-svdtt\") pod \"multus-additional-cni-plugins-lbt25\" (UID: \"c2470111-1b59-4048-89ff-2b7e83659200\") " pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.264679 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c2470111-1b59-4048-89ff-2b7e83659200-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lbt25\" (UID: \"c2470111-1b59-4048-89ff-2b7e83659200\") " pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.264925 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c2470111-1b59-4048-89ff-2b7e83659200-cni-binary-copy\") pod \"multus-additional-cni-plugins-lbt25\" (UID: \"c2470111-1b59-4048-89ff-2b7e83659200\") " pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.285180 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/564529e3-ff40-4923-9f6d-319a9b41720a-ovnkube-config\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.286015 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/564529e3-ff40-4923-9f6d-319a9b41720a-ovnkube-script-lib\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.286253 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/564529e3-ff40-4923-9f6d-319a9b41720a-env-overrides\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.287062 4636 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f078d6dd-d81e-4a06-aca1-508bf23a2170-mcd-auth-proxy-config\") pod \"machine-config-daemon-ngmch\" (UID: \"f078d6dd-d81e-4a06-aca1-508bf23a2170\") " pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.288285 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f078d6dd-d81e-4a06-aca1-508bf23a2170-proxy-tls\") pod \"machine-config-daemon-ngmch\" (UID: \"f078d6dd-d81e-4a06-aca1-508bf23a2170\") " pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.292252 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5fmd\" (UniqueName: \"kubernetes.io/projected/140a698f-2661-4dc8-86d9-929b0d6dd326-kube-api-access-n5fmd\") pod \"multus-ltsq6\" (UID: \"140a698f-2661-4dc8-86d9-929b0d6dd326\") " pod="openshift-multus/multus-ltsq6" Oct 03 14:01:14 crc kubenswrapper[4636]: W1003 14:01:14.299373 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61e13aef_fd75_4e3e_a84d_44093600f786.slice/crio-84121e988c13f8aa2af0ddf3c10a5636d19aceb6246a6277feff22fab2559723 WatchSource:0}: Error finding container 84121e988c13f8aa2af0ddf3c10a5636d19aceb6246a6277feff22fab2559723: Status 404 returned error can't find the container with id 84121e988c13f8aa2af0ddf3c10a5636d19aceb6246a6277feff22fab2559723 Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.390751 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/564529e3-ff40-4923-9f6d-319a9b41720a-ovn-node-metrics-cert\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.394972 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p9qj\" (UniqueName: \"kubernetes.io/projected/564529e3-ff40-4923-9f6d-319a9b41720a-kube-api-access-2p9qj\") pod \"ovnkube-node-t7xd5\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.453317 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ltsq6" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.460970 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lbt25" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.465938 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 14:01:14 crc kubenswrapper[4636]: W1003 14:01:14.475521 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod140a698f_2661_4dc8_86d9_929b0d6dd326.slice/crio-c72d1cac2eec00d671373157122a8cc3fe117714285ae8b5811edaf7cfbac820 WatchSource:0}: Error finding container c72d1cac2eec00d671373157122a8cc3fe117714285ae8b5811edaf7cfbac820: Status 404 returned error can't find the container with id c72d1cac2eec00d671373157122a8cc3fe117714285ae8b5811edaf7cfbac820 Oct 03 14:01:14 crc kubenswrapper[4636]: W1003 14:01:14.489679 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2470111_1b59_4048_89ff_2b7e83659200.slice/crio-72285a910875445e7986f22a86cd0604bd3090fd03926097f724c2ddb17877ca WatchSource:0}: Error finding container 72285a910875445e7986f22a86cd0604bd3090fd03926097f724c2ddb17877ca: Status 404 returned error can't find the container with id 72285a910875445e7986f22a86cd0604bd3090fd03926097f724c2ddb17877ca Oct 03 14:01:14 crc kubenswrapper[4636]: W1003 14:01:14.491963 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf078d6dd_d81e_4a06_aca1_508bf23a2170.slice/crio-5a42e7c984731a01a2b97e02a58d1ae9f152e9ece26f9894ed8832e009f24132 WatchSource:0}: Error finding container 5a42e7c984731a01a2b97e02a58d1ae9f152e9ece26f9894ed8832e009f24132: Status 404 returned error can't find the container with id 5a42e7c984731a01a2b97e02a58d1ae9f152e9ece26f9894ed8832e009f24132 Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.585292 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:14 crc kubenswrapper[4636]: W1003 14:01:14.616355 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod564529e3_ff40_4923_9f6d_319a9b41720a.slice/crio-44658c054ab5f0148595b86dd04430dc1277d05a6f5bf8018cdf03f7318586fa WatchSource:0}: Error finding container 44658c054ab5f0148595b86dd04430dc1277d05a6f5bf8018cdf03f7318586fa: Status 404 returned error can't find the container with id 44658c054ab5f0148595b86dd04430dc1277d05a6f5bf8018cdf03f7318586fa Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.793279 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:14 crc kubenswrapper[4636]: E1003 14:01:14.793405 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.793731 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.793761 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:14 crc kubenswrapper[4636]: E1003 14:01:14.793791 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:01:14 crc kubenswrapper[4636]: E1003 14:01:14.793844 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.916740 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerStarted","Data":"8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f"} Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.916782 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerStarted","Data":"5a42e7c984731a01a2b97e02a58d1ae9f152e9ece26f9894ed8832e009f24132"} Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.917712 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r9xm2" event={"ID":"61e13aef-fd75-4e3e-a84d-44093600f786","Type":"ContainerStarted","Data":"5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a"} Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.917733 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r9xm2" event={"ID":"61e13aef-fd75-4e3e-a84d-44093600f786","Type":"ContainerStarted","Data":"84121e988c13f8aa2af0ddf3c10a5636d19aceb6246a6277feff22fab2559723"} Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.919082 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" event={"ID":"c2470111-1b59-4048-89ff-2b7e83659200","Type":"ContainerStarted","Data":"72285a910875445e7986f22a86cd0604bd3090fd03926097f724c2ddb17877ca"} Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.920699 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.923332 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xf7xs" event={"ID":"686acf3e-9445-4e3f-9a49-d714556a8e52","Type":"ContainerStarted","Data":"1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834"} Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.924455 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ltsq6" event={"ID":"140a698f-2661-4dc8-86d9-929b0d6dd326","Type":"ContainerStarted","Data":"45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57"} Oct 03 14:01:14 crc 
kubenswrapper[4636]: I1003 14:01:14.924488 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ltsq6" event={"ID":"140a698f-2661-4dc8-86d9-929b0d6dd326","Type":"ContainerStarted","Data":"c72d1cac2eec00d671373157122a8cc3fe117714285ae8b5811edaf7cfbac820"} Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.925445 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerStarted","Data":"0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80"} Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.925472 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerStarted","Data":"44658c054ab5f0148595b86dd04430dc1277d05a6f5bf8018cdf03f7318586fa"} Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.928841 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-di
r\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.940596 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.960153 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.971312 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.982306 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:14 crc kubenswrapper[4636]: I1003 14:01:14.991798 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.002889 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.011012 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.018135 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.024870 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.035638 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.044162 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.052847 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.064941 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.073907 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.085304 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.101754 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.112427 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.122297 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.130091 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.140346 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.150160 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.159173 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.166491 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.175941 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.184671 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.194739 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154
edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.206078 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.455041 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:01:15 crc kubenswrapper[4636]: E1003 14:01:15.455213 4636 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:01:19.455189152 +0000 UTC m=+29.313915409 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.556355 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.556397 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.556423 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.556449 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:15 crc kubenswrapper[4636]: E1003 14:01:15.556553 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:01:15 crc kubenswrapper[4636]: E1003 14:01:15.556552 4636 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:01:15 crc kubenswrapper[4636]: E1003 14:01:15.556567 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:01:15 crc kubenswrapper[4636]: E1003 14:01:15.556582 4636 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:15 crc kubenswrapper[4636]: E1003 14:01:15.556600 4636 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:01:15 crc kubenswrapper[4636]: E1003 14:01:15.556622 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:19.556606525 +0000 UTC m=+29.415332772 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:15 crc kubenswrapper[4636]: E1003 14:01:15.556726 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:19.556702247 +0000 UTC m=+29.415428504 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:01:15 crc kubenswrapper[4636]: E1003 14:01:15.556743 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:19.556734798 +0000 UTC m=+29.415461055 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:01:15 crc kubenswrapper[4636]: E1003 14:01:15.556833 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:01:15 crc kubenswrapper[4636]: E1003 14:01:15.556850 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:01:15 crc kubenswrapper[4636]: E1003 14:01:15.556865 4636 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:15 crc kubenswrapper[4636]: E1003 14:01:15.556907 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:19.556892942 +0000 UTC m=+29.415619439 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.930962 4636 generic.go:334] "Generic (PLEG): container finished" podID="c2470111-1b59-4048-89ff-2b7e83659200" containerID="1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25" exitCode=0 Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.931041 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" event={"ID":"c2470111-1b59-4048-89ff-2b7e83659200","Type":"ContainerDied","Data":"1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25"} Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.934176 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5"} Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.936089 4636 generic.go:334] "Generic (PLEG): container finished" podID="564529e3-ff40-4923-9f6d-319a9b41720a" containerID="0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80" exitCode=0 Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.936208 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerDied","Data":"0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80"} Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.938242 4636 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerStarted","Data":"b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287"} Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.947212 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:15Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.962018 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:15Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:15 crc kubenswrapper[4636]: I1003 14:01:15.988506 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:15Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.011523 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.028568 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.043651 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.054767 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.064502 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.074034 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.086961 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.099726 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.120090 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.150914 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.172250 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc 
kubenswrapper[4636]: I1003 14:01:16.193975 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.215878 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.226673 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.238357 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.251969 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 
shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.265148 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.279091 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.297765 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z 
is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.309600 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.322092 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.341245 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.360173 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.376509 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.389351 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.793171 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:16 crc kubenswrapper[4636]: E1003 14:01:16.793291 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.793388 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:16 crc kubenswrapper[4636]: E1003 14:01:16.793528 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.793584 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:16 crc kubenswrapper[4636]: E1003 14:01:16.793634 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.944852 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerStarted","Data":"63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc"} Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.944946 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerStarted","Data":"552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847"} Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.944957 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerStarted","Data":"c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4"} Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.947876 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" event={"ID":"c2470111-1b59-4048-89ff-2b7e83659200","Type":"ContainerStarted","Data":"d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464"} Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.964249 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.982717 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:16 crc kubenswrapper[4636]: I1003 14:01:16.999528 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedA
t\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:16Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.013719 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:17Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.034395 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:17Z 
is after 2025-08-24T17:21:41Z" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.047402 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:17Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.060309 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:17Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.074252 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:17Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.088924 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:17Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.103342 4636 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:17Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.119610 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:17Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.146881 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:17Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.158648 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:17Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.169010 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:17Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.387453 4636 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.389376 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.389410 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.389417 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.389518 4636 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.396010 4636 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.396203 4636 kubelet_node_status.go:79] "Successfully registered node" 
node="crc" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.397248 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.397277 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.397289 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.397306 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.397319 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:17Z","lastTransitionTime":"2025-10-03T14:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:17 crc kubenswrapper[4636]: E1003 14:01:17.413995 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:17Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.417455 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.417500 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.417513 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.417529 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.417538 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:17Z","lastTransitionTime":"2025-10-03T14:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:17 crc kubenswrapper[4636]: E1003 14:01:17.431001 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:17Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.436690 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.436725 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.436737 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.436756 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.436767 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:17Z","lastTransitionTime":"2025-10-03T14:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:17 crc kubenswrapper[4636]: E1003 14:01:17.451533 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:17Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.455967 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.456025 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.456044 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.456068 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.456085 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:17Z","lastTransitionTime":"2025-10-03T14:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:17 crc kubenswrapper[4636]: E1003 14:01:17.469476 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:17Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.473205 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.473245 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.473258 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.473276 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.473290 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:17Z","lastTransitionTime":"2025-10-03T14:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:17 crc kubenswrapper[4636]: E1003 14:01:17.487742 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:17Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:17 crc kubenswrapper[4636]: E1003 14:01:17.487850 4636 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.489954 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.489981 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.489990 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.490004 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.490040 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:17Z","lastTransitionTime":"2025-10-03T14:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.592676 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.592705 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.592714 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.592728 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.592737 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:17Z","lastTransitionTime":"2025-10-03T14:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.695404 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.695450 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.695458 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.695470 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.695480 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:17Z","lastTransitionTime":"2025-10-03T14:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.799211 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.799424 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.799434 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.799448 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.799459 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:17Z","lastTransitionTime":"2025-10-03T14:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.902194 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.902226 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.902234 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.902247 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.902255 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:17Z","lastTransitionTime":"2025-10-03T14:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.952606 4636 generic.go:334] "Generic (PLEG): container finished" podID="c2470111-1b59-4048-89ff-2b7e83659200" containerID="d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464" exitCode=0 Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.952660 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" event={"ID":"c2470111-1b59-4048-89ff-2b7e83659200","Type":"ContainerDied","Data":"d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464"} Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.958411 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerStarted","Data":"5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422"} Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.958444 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerStarted","Data":"9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4"} Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.958453 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerStarted","Data":"a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080"} Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.978306 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:17Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:17 crc kubenswrapper[4636]: I1003 14:01:17.999083 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"moun
tPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:17Z is after 2025-08-24T17:21:41Z"
Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.005348 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.005384 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.005393 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.005407 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.005416 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:18Z","lastTransitionTime":"2025-10-03T14:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.014937 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:18Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.028812 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:18Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.045848 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:18Z 
is after 2025-08-24T17:21:41Z"
Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.053542 4636 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.054246 4636 scope.go:117] "RemoveContainer" containerID="372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf"
Oct 03 14:01:18 crc kubenswrapper[4636]: E1003 14:01:18.054441 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.059560 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:18Z is after 2025-08-24T17:21:41Z"
Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.072731 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:18Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.085181 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:18Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.097998 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:18Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.111202 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.111250 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.111261 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.111279 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.111293 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:18Z","lastTransitionTime":"2025-10-03T14:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.116503 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 
shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:18Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.159340 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:18Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.182154 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:18Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.200227 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:18Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.214462 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.214498 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.214508 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.214523 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.214533 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:18Z","lastTransitionTime":"2025-10-03T14:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.216193 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:18Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.316791 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:18 crc 
kubenswrapper[4636]: I1003 14:01:18.316849 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.316858 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.316874 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.316884 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:18Z","lastTransitionTime":"2025-10-03T14:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.422584 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.422632 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.422642 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.422659 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.422703 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:18Z","lastTransitionTime":"2025-10-03T14:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.525228 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.525294 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.525311 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.525334 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.525348 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:18Z","lastTransitionTime":"2025-10-03T14:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.628218 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.628259 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.628268 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.628283 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.628292 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:18Z","lastTransitionTime":"2025-10-03T14:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.731203 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.731234 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.731243 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.731255 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.731264 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:18Z","lastTransitionTime":"2025-10-03T14:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.792746 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:18 crc kubenswrapper[4636]: E1003 14:01:18.792871 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.792746 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.793153 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:18 crc kubenswrapper[4636]: E1003 14:01:18.793281 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:01:18 crc kubenswrapper[4636]: E1003 14:01:18.793471 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.833051 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.833093 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.833120 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.833137 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.833149 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:18Z","lastTransitionTime":"2025-10-03T14:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.934797 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.934831 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.934840 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.934857 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.934866 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:18Z","lastTransitionTime":"2025-10-03T14:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.963139 4636 generic.go:334] "Generic (PLEG): container finished" podID="c2470111-1b59-4048-89ff-2b7e83659200" containerID="e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e" exitCode=0 Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.963212 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" event={"ID":"c2470111-1b59-4048-89ff-2b7e83659200","Type":"ContainerDied","Data":"e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e"} Oct 03 14:01:18 crc kubenswrapper[4636]: I1003 14:01:18.985725 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:18Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.002499 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:18Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.023931 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.038056 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.038110 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.038127 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.038145 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.038157 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:19Z","lastTransitionTime":"2025-10-03T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.041907 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.058654 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 
shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.069884 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.081894 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.091179 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.101229 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.120931 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.133827 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.141425 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.141452 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.141460 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.141474 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.141483 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:19Z","lastTransitionTime":"2025-10-03T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.146180 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.157299 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.176440 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:19Z 
is after 2025-08-24T17:21:41Z" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.243351 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.243812 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.243884 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.243949 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.244017 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:19Z","lastTransitionTime":"2025-10-03T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.347456 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.347492 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.347500 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.347515 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.347524 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:19Z","lastTransitionTime":"2025-10-03T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.450743 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.450797 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.450816 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.450840 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.450858 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:19Z","lastTransitionTime":"2025-10-03T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.518912 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:01:19 crc kubenswrapper[4636]: E1003 14:01:19.519204 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:01:27.51915555 +0000 UTC m=+37.377881837 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.553739 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.553777 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.553788 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.553806 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.553817 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:19Z","lastTransitionTime":"2025-10-03T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.620388 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.620448 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.620486 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.620531 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:19 crc kubenswrapper[4636]: E1003 14:01:19.620588 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:01:19 crc kubenswrapper[4636]: E1003 14:01:19.620607 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:01:19 crc kubenswrapper[4636]: E1003 14:01:19.620626 4636 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:01:19 crc kubenswrapper[4636]: E1003 14:01:19.620635 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:01:19 crc kubenswrapper[4636]: E1003 14:01:19.620688 4636 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:01:19 crc kubenswrapper[4636]: E1003 14:01:19.620689 4636 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:19 crc kubenswrapper[4636]: E1003 14:01:19.620694 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:27.620671675 +0000 UTC m=+37.479397932 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:01:19 crc kubenswrapper[4636]: E1003 14:01:19.620615 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:01:19 crc kubenswrapper[4636]: E1003 14:01:19.620753 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:27.620736357 +0000 UTC m=+37.479462614 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:01:19 crc kubenswrapper[4636]: E1003 14:01:19.620760 4636 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:19 crc kubenswrapper[4636]: E1003 14:01:19.620772 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:27.620763487 +0000 UTC m=+37.479489744 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:19 crc kubenswrapper[4636]: E1003 14:01:19.620803 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:27.620781768 +0000 UTC m=+37.479508085 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.656358 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.656394 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.656404 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.656420 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.656430 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:19Z","lastTransitionTime":"2025-10-03T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.758352 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.758385 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.758395 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.758408 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.758418 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:19Z","lastTransitionTime":"2025-10-03T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.861031 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.861068 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.861078 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.861112 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.861125 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:19Z","lastTransitionTime":"2025-10-03T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.963362 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.963400 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.963412 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.963451 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.963465 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:19Z","lastTransitionTime":"2025-10-03T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.969984 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerStarted","Data":"76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362"} Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.971939 4636 generic.go:334] "Generic (PLEG): container finished" podID="c2470111-1b59-4048-89ff-2b7e83659200" containerID="950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019" exitCode=0 Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.971966 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" event={"ID":"c2470111-1b59-4048-89ff-2b7e83659200","Type":"ContainerDied","Data":"950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019"} Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.985331 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\
\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:19 crc kubenswrapper[4636]: I1003 14:01:19.997492 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:19Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.014554 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z 
is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.031692 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.045775 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.064892 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.065847 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.065874 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.065885 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.065900 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.065914 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:20Z","lastTransitionTime":"2025-10-03T14:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.088197 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.100750 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.113443 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.124078 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.134294 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.143981 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.156587 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.168729 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.168767 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.168776 4636 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.168791 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.168800 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:20Z","lastTransitionTime":"2025-10-03T14:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.173277 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.271492 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.271527 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.271535 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.271548 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.271558 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:20Z","lastTransitionTime":"2025-10-03T14:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.373663 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.373705 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.373716 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.373731 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.373742 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:20Z","lastTransitionTime":"2025-10-03T14:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.475995 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.476034 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.476047 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.476064 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.476074 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:20Z","lastTransitionTime":"2025-10-03T14:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.578600 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.578631 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.578640 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.578652 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.578661 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:20Z","lastTransitionTime":"2025-10-03T14:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.681532 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.681583 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.681599 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.681622 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.681639 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:20Z","lastTransitionTime":"2025-10-03T14:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.784186 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.784237 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.784252 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.784270 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.784285 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:20Z","lastTransitionTime":"2025-10-03T14:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.793477 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:20 crc kubenswrapper[4636]: E1003 14:01:20.793596 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.793622 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:20 crc kubenswrapper[4636]: E1003 14:01:20.793795 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.793870 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:20 crc kubenswrapper[4636]: E1003 14:01:20.793969 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.804895 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.816047 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.830804 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.843138 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.853928 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.874337 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.886445 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.886497 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.886514 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.886537 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.886552 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:20Z","lastTransitionTime":"2025-10-03T14:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.890846 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.906153 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.915538 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.932687 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.945711 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\"
:\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.961156 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.975815 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.988452 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.988482 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.988490 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.988504 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.988515 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:20Z","lastTransitionTime":"2025-10-03T14:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:20 crc kubenswrapper[4636]: I1003 14:01:20.996188 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5a
d5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.091026 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.091063 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.091074 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.091124 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.091138 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:21Z","lastTransitionTime":"2025-10-03T14:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.193778 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.193813 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.193823 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.193836 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.193845 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:21Z","lastTransitionTime":"2025-10-03T14:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.296457 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.296484 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.296492 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.296506 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.296514 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:21Z","lastTransitionTime":"2025-10-03T14:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.398975 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.399008 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.399016 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.399029 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.399038 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:21Z","lastTransitionTime":"2025-10-03T14:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.502262 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.502303 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.502311 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.502328 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.502337 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:21Z","lastTransitionTime":"2025-10-03T14:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.604674 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.604729 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.604742 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.604756 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.604765 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:21Z","lastTransitionTime":"2025-10-03T14:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.706933 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.706964 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.706973 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.706988 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.706995 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:21Z","lastTransitionTime":"2025-10-03T14:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.809931 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.809981 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.809994 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.810013 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.810023 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:21Z","lastTransitionTime":"2025-10-03T14:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.912089 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.912139 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.912158 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.912173 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.912182 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:21Z","lastTransitionTime":"2025-10-03T14:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:21 crc kubenswrapper[4636]: I1003 14:01:21.982290 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" event={"ID":"c2470111-1b59-4048-89ff-2b7e83659200","Type":"ContainerStarted","Data":"0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483"} Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.014343 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.014421 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.014445 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.014478 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.014500 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:22Z","lastTransitionTime":"2025-10-03T14:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.117416 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.117891 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.117905 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.117934 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.117953 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:22Z","lastTransitionTime":"2025-10-03T14:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.220388 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.220434 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.220446 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.220463 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.220477 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:22Z","lastTransitionTime":"2025-10-03T14:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.323479 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.323521 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.323532 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.323548 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.323559 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:22Z","lastTransitionTime":"2025-10-03T14:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.425771 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.425803 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.425813 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.425827 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.425836 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:22Z","lastTransitionTime":"2025-10-03T14:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.528282 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.528357 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.528391 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.528421 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.528444 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:22Z","lastTransitionTime":"2025-10-03T14:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.630840 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.630884 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.630897 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.630916 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.630929 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:22Z","lastTransitionTime":"2025-10-03T14:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.733316 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.733360 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.733371 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.733389 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.733401 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:22Z","lastTransitionTime":"2025-10-03T14:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.793159 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.793235 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:22 crc kubenswrapper[4636]: E1003 14:01:22.793298 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.793159 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:22 crc kubenswrapper[4636]: E1003 14:01:22.793369 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:01:22 crc kubenswrapper[4636]: E1003 14:01:22.793523 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.835915 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.835945 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.835954 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.835969 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.835977 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:22Z","lastTransitionTime":"2025-10-03T14:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.941429 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.941481 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.941499 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.941518 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.941530 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:22Z","lastTransitionTime":"2025-10-03T14:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.989355 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerStarted","Data":"ac0ffc7d44d1c99317dcab49257065f81caa0658ac771596f7c192e18452bc02"} Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.989909 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:22 crc kubenswrapper[4636]: I1003 14:01:22.989937 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.003670 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.022849 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\"
:\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.035806 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.044236 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.044393 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.044462 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.044524 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.044582 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:23Z","lastTransitionTime":"2025-10-03T14:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.046445 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.061121 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log
-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0ffc7d44d1c99317dcab49257065f81caa0658ac771596f7c192e18452bc02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\
\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.065895 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.068312 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.081150 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.095456 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.107644 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-l
ib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.122160 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.137583 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 
shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.146889 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.146965 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.146979 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.147006 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.147018 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:23Z","lastTransitionTime":"2025-10-03T14:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.151114 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.165745 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.178161 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.190897 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.206584 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.220837 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\"
:\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.231722 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.245343 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.249087 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.249135 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.249144 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.249156 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.249166 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:23Z","lastTransitionTime":"2025-10-03T14:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.263074 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0ffc7d44d1c99317dcab49257065f81caa0658ac771596f7c192e18452bc02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.274863 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.286801 4636 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.298621 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.312695 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.321992 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.334608 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 
shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.344885 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.351080 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.351127 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.351137 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.351152 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.351163 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:23Z","lastTransitionTime":"2025-10-03T14:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.354765 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.365990 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:23Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.453273 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.453301 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.453309 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.453322 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.453331 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:23Z","lastTransitionTime":"2025-10-03T14:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.556586 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.556641 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.556649 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.556664 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.556673 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:23Z","lastTransitionTime":"2025-10-03T14:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.662281 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.662305 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.662313 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.662326 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.662334 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:23Z","lastTransitionTime":"2025-10-03T14:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.765808 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.765845 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.765854 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.765870 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.765882 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:23Z","lastTransitionTime":"2025-10-03T14:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.871222 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.871278 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.871295 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.871317 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.871334 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:23Z","lastTransitionTime":"2025-10-03T14:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.975763 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.975797 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.975809 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.975826 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.975838 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:23Z","lastTransitionTime":"2025-10-03T14:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.997675 4636 generic.go:334] "Generic (PLEG): container finished" podID="c2470111-1b59-4048-89ff-2b7e83659200" containerID="0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483" exitCode=0 Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.997707 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" event={"ID":"c2470111-1b59-4048-89ff-2b7e83659200","Type":"ContainerDied","Data":"0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483"} Oct 03 14:01:23 crc kubenswrapper[4636]: I1003 14:01:23.997798 4636 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.013313 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni
/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.024442 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9
fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.045409 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T14:01:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.056298 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.066573 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.076065 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.077994 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.078023 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.078036 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.078053 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.078065 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:24Z","lastTransitionTime":"2025-10-03T14:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.090024 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.101908 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.112537 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.126257 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.141414 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mo
untPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.156911 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.169175 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.181549 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.181794 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.181807 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.181824 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.181836 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:24Z","lastTransitionTime":"2025-10-03T14:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.187749 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0ffc7d44d1c99317dcab49257065f81caa0658ac771596f7c192e18452bc02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:24Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.283603 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.283649 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.283660 
4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.283676 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.283690 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:24Z","lastTransitionTime":"2025-10-03T14:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.386176 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.386217 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.386228 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.386253 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.386263 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:24Z","lastTransitionTime":"2025-10-03T14:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.488837 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.489074 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.489113 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.489136 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.489151 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:24Z","lastTransitionTime":"2025-10-03T14:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.591890 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.591933 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.591947 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.591964 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.591976 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:24Z","lastTransitionTime":"2025-10-03T14:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.694500 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.694534 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.694542 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.694556 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.694565 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:24Z","lastTransitionTime":"2025-10-03T14:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.793369 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:24 crc kubenswrapper[4636]: E1003 14:01:24.793525 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.793952 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:24 crc kubenswrapper[4636]: E1003 14:01:24.794021 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.794077 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:24 crc kubenswrapper[4636]: E1003 14:01:24.794175 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.799528 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.799581 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.799595 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.799619 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.799632 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:24Z","lastTransitionTime":"2025-10-03T14:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.902345 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.902558 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.902620 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.902720 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:24 crc kubenswrapper[4636]: I1003 14:01:24.902793 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:24Z","lastTransitionTime":"2025-10-03T14:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.005089 4636 generic.go:334] "Generic (PLEG): container finished" podID="c2470111-1b59-4048-89ff-2b7e83659200" containerID="091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf" exitCode=0 Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.005295 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" event={"ID":"c2470111-1b59-4048-89ff-2b7e83659200","Type":"ContainerDied","Data":"091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf"} Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.005342 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.005363 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.005372 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.005343 4636 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.005385 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.005396 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:25Z","lastTransitionTime":"2025-10-03T14:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.026239 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.046770 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.064563 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.078709 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.092535 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.108782 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.108819 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.108828 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.108841 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.108849 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:25Z","lastTransitionTime":"2025-10-03T14:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.108878 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.122607 4636 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2da
ed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\
",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.134023 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.149153 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.166619 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0ffc7d44d1c99317dcab49257065f81caa0658
ac771596f7c192e18452bc02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.181926 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.197845 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.211722 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.211759 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.211769 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.211784 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.211794 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:25Z","lastTransitionTime":"2025-10-03T14:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.212749 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.225598 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.313408 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.313437 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.313445 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.313457 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.313468 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:25Z","lastTransitionTime":"2025-10-03T14:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.415788 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.415830 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.415841 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.415857 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.415869 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:25Z","lastTransitionTime":"2025-10-03T14:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.429333 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq"] Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.429783 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.431349 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.432244 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.445873 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controlle
r-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.458406 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.480136 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0ffc7d44d1c99317dcab49257065f81caa0658
ac771596f7c192e18452bc02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.484413 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5045a93a-725e-48c0-b553-2c10569de997-env-overrides\") pod \"ovnkube-control-plane-749d76644c-j5vpq\" (UID: \"5045a93a-725e-48c0-b553-2c10569de997\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.484455 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5045a93a-725e-48c0-b553-2c10569de997-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-j5vpq\" (UID: \"5045a93a-725e-48c0-b553-2c10569de997\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.484480 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8b85\" (UniqueName: \"kubernetes.io/projected/5045a93a-725e-48c0-b553-2c10569de997-kube-api-access-d8b85\") pod \"ovnkube-control-plane-749d76644c-j5vpq\" (UID: \"5045a93a-725e-48c0-b553-2c10569de997\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.484502 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5045a93a-725e-48c0-b553-2c10569de997-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-j5vpq\" 
(UID: \"5045a93a-725e-48c0-b553-2c10569de997\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.494562 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\
\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.505327 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.518411 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.518450 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.518458 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.518473 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.518484 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:25Z","lastTransitionTime":"2025-10-03T14:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.520477 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.535624 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.544707 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.554469 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.566233 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.577637 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.585392 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8b85\" (UniqueName: \"kubernetes.io/projected/5045a93a-725e-48c0-b553-2c10569de997-kube-api-access-d8b85\") pod \"ovnkube-control-plane-749d76644c-j5vpq\" (UID: \"5045a93a-725e-48c0-b553-2c10569de997\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.585430 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5045a93a-725e-48c0-b553-2c10569de997-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-j5vpq\" (UID: \"5045a93a-725e-48c0-b553-2c10569de997\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.585475 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5045a93a-725e-48c0-b553-2c10569de997-env-overrides\") pod \"ovnkube-control-plane-749d76644c-j5vpq\" (UID: \"5045a93a-725e-48c0-b553-2c10569de997\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.585500 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5045a93a-725e-48c0-b553-2c10569de997-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-j5vpq\" (UID: \"5045a93a-725e-48c0-b553-2c10569de997\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.586495 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5045a93a-725e-48c0-b553-2c10569de997-env-overrides\") 
pod \"ovnkube-control-plane-749d76644c-j5vpq\" (UID: \"5045a93a-725e-48c0-b553-2c10569de997\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.586562 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5045a93a-725e-48c0-b553-2c10569de997-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-j5vpq\" (UID: \"5045a93a-725e-48c0-b553-2c10569de997\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.590949 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.591435 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5045a93a-725e-48c0-b553-2c10569de997-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-j5vpq\" (UID: \"5045a93a-725e-48c0-b553-2c10569de997\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.602382 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8b85\" (UniqueName: 
\"kubernetes.io/projected/5045a93a-725e-48c0-b553-2c10569de997-kube-api-access-d8b85\") pod \"ovnkube-control-plane-749d76644c-j5vpq\" (UID: \"5045a93a-725e-48c0-b553-2c10569de997\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.604281 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 
14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.621117 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.621155 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.621163 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.621175 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.621185 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:25Z","lastTransitionTime":"2025-10-03T14:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.622270 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.632933 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:25Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.723131 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.723433 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.723654 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.723842 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.724058 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:25Z","lastTransitionTime":"2025-10-03T14:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.746519 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" Oct 03 14:01:25 crc kubenswrapper[4636]: W1003 14:01:25.760205 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5045a93a_725e_48c0_b553_2c10569de997.slice/crio-a3d0a8914cfa2b5787c8802f3563941c104f9046dbf624de4f33ad739bcd1073 WatchSource:0}: Error finding container a3d0a8914cfa2b5787c8802f3563941c104f9046dbf624de4f33ad739bcd1073: Status 404 returned error can't find the container with id a3d0a8914cfa2b5787c8802f3563941c104f9046dbf624de4f33ad739bcd1073 Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.826590 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.826626 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.826635 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.826651 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.826660 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:25Z","lastTransitionTime":"2025-10-03T14:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.929848 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.929891 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.929905 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.929924 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:25 crc kubenswrapper[4636]: I1003 14:01:25.929938 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:25Z","lastTransitionTime":"2025-10-03T14:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.008624 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" event={"ID":"5045a93a-725e-48c0-b553-2c10569de997","Type":"ContainerStarted","Data":"a3d0a8914cfa2b5787c8802f3563941c104f9046dbf624de4f33ad739bcd1073"} Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.011680 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" event={"ID":"c2470111-1b59-4048-89ff-2b7e83659200","Type":"ContainerStarted","Data":"3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21"} Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.024568 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"
quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.032370 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.032403 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.032414 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.032429 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.032440 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:26Z","lastTransitionTime":"2025-10-03T14:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.038236 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.056691 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0ffc7d44d1c99317dcab49257065f81caa0658ac771596f7c192e18452bc02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.067932 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.078588 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.090644 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-l
ib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.101861 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.114336 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 
shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.124943 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.134795 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.134834 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.134845 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.134861 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.134873 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:26Z","lastTransitionTime":"2025-10-03T14:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.136008 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.144775 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.154432 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.164661 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.176484 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.185182 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.237476 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:26 crc 
kubenswrapper[4636]: I1003 14:01:26.237516 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.237527 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.237542 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.237551 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:26Z","lastTransitionTime":"2025-10-03T14:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.339870 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.339907 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.339917 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.339934 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.339945 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:26Z","lastTransitionTime":"2025-10-03T14:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.442324 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.442362 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.442375 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.442392 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.442405 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:26Z","lastTransitionTime":"2025-10-03T14:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.545396 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.545429 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.545437 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.545450 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.545458 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:26Z","lastTransitionTime":"2025-10-03T14:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.647884 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.647921 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.647932 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.647948 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.647960 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:26Z","lastTransitionTime":"2025-10-03T14:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.751025 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.751073 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.751086 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.751129 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.751144 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:26Z","lastTransitionTime":"2025-10-03T14:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.793745 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.793860 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.793949 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:26 crc kubenswrapper[4636]: E1003 14:01:26.794110 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:01:26 crc kubenswrapper[4636]: E1003 14:01:26.794274 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:01:26 crc kubenswrapper[4636]: E1003 14:01:26.794436 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.854630 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.854700 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.854719 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.854750 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.854770 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:26Z","lastTransitionTime":"2025-10-03T14:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.886858 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-vm9z7"] Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.887311 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:26 crc kubenswrapper[4636]: E1003 14:01:26.887370 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.913506 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.930241 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.944208 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.958714 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.958771 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.958786 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.958815 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.958834 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:26Z","lastTransitionTime":"2025-10-03T14:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.963281 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:26 crc kubenswrapper[4636]: I1003 14:01:26.991795 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:26Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.000235 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs\") pod \"network-metrics-daemon-vm9z7\" (UID: \"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\") " pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.000622 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb48b\" (UniqueName: \"kubernetes.io/projected/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-kube-api-access-fb48b\") pod \"network-metrics-daemon-vm9z7\" (UID: \"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\") " pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.011708 4636 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.019276 4636 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7xd5_564529e3-ff40-4923-9f6d-319a9b41720a/ovnkube-controller/0.log" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.022337 4636 generic.go:334] "Generic (PLEG): container finished" podID="564529e3-ff40-4923-9f6d-319a9b41720a" containerID="ac0ffc7d44d1c99317dcab49257065f81caa0658ac771596f7c192e18452bc02" exitCode=1 Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.022394 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerDied","Data":"ac0ffc7d44d1c99317dcab49257065f81caa0658ac771596f7c192e18452bc02"} Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.022921 4636 scope.go:117] "RemoveContainer" containerID="ac0ffc7d44d1c99317dcab49257065f81caa0658ac771596f7c192e18452bc02" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.024820 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" event={"ID":"5045a93a-725e-48c0-b553-2c10569de997","Type":"ContainerStarted","Data":"cb259a3c7fa12a75c4729979def8392191091bb90da1bce6bb7112e0a5995392"} Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.028229 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.047734 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/
net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.062363 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.062407 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.062417 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.062433 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.062443 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:27Z","lastTransitionTime":"2025-10-03T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.070847 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0ffc7d44d1c99317dcab49257065f81caa0658ac771596f7c192e18452bc02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.089149 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm9z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.102079 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs\") pod \"network-metrics-daemon-vm9z7\" (UID: \"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\") " pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.102361 4636 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fb48b\" (UniqueName: \"kubernetes.io/projected/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-kube-api-access-fb48b\") pod \"network-metrics-daemon-vm9z7\" (UID: \"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\") " pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.102297 4636 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.103174 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs podName:a7f8fb91-fbef-43b5-b771-f376cfbb1cdd nodeName:}" failed. No retries permitted until 2025-10-03 14:01:27.603136453 +0000 UTC m=+37.461862700 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs") pod "network-metrics-daemon-vm9z7" (UID: "a7f8fb91-fbef-43b5-b771-f376cfbb1cdd") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.108707 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-m
anager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.121545 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.122703 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb48b\" (UniqueName: \"kubernetes.io/projected/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-kube-api-access-fb48b\") pod \"network-metrics-daemon-vm9z7\" (UID: \"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\") " pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.139566 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.152182 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.166629 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.166675 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.166686 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.166703 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.166719 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:27Z","lastTransitionTime":"2025-10-03T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.169500 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.185429 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.200570 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.219178 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.243851 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.259723 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.269678 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.269724 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.269737 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.269764 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.269782 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:27Z","lastTransitionTime":"2025-10-03T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.280630 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 
shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.299548 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.314470 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.330557 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.346206 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.363231 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.372414 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.372461 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.372472 4636 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.372490 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.372501 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:27Z","lastTransitionTime":"2025-10-03T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.381414 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\
":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d
0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Runnin
g\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.395372 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.411013 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.423492 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.444980 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0ffc7d44d1c99317dcab49257065f81caa0658
ac771596f7c192e18452bc02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0ffc7d44d1c99317dcab49257065f81caa0658ac771596f7c192e18452bc02\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 14:01:26.024574 5821 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 14:01:26.024674 5821 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 14:01:26.024698 5821 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 14:01:26.024715 5821 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 14:01:26.024730 5821 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 14:01:26.024749 5821 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 14:01:26.024763 5821 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 14:01:26.024767 5821 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 14:01:26.024774 5821 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 14:01:26.024785 5821 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 14:01:26.024793 5821 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 14:01:26.024799 5821 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 14:01:26.024818 5821 factory.go:656] Stopping watch factory\\\\nI1003 14:01:26.024830 5821 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:01:26.024852 5821 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.458598 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm9z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.529790 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.530162 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:01:43.530093001 +0000 UTC m=+53.388819248 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.531857 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.531934 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.531951 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.531977 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.531992 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:27Z","lastTransitionTime":"2025-10-03T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.533503 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.533539 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.533588 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.533612 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.533627 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:27Z","lastTransitionTime":"2025-10-03T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.546762 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.550557 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.550606 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.550621 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.550645 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.550659 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:27Z","lastTransitionTime":"2025-10-03T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.562945 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.567825 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.567878 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.567895 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.567918 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.567932 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:27Z","lastTransitionTime":"2025-10-03T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.581721 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.586523 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.586572 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.586583 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.586606 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.586617 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:27Z","lastTransitionTime":"2025-10-03T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.602210 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.606739 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.606785 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.606798 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.606819 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.606833 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:27Z","lastTransitionTime":"2025-10-03T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.622069 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:27Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.622260 4636 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.630496 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.630547 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.630578 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.630599 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs\") pod \"network-metrics-daemon-vm9z7\" (UID: \"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\") " pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.630622 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.630759 4636 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.630820 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:43.630797065 +0000 UTC m=+53.489523312 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.631205 4636 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.631240 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs podName:a7f8fb91-fbef-43b5-b771-f376cfbb1cdd nodeName:}" failed. 
No retries permitted until 2025-10-03 14:01:28.631230596 +0000 UTC m=+38.489956843 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs") pod "network-metrics-daemon-vm9z7" (UID: "a7f8fb91-fbef-43b5-b771-f376cfbb1cdd") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.631280 4636 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.631304 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:43.631298108 +0000 UTC m=+53.490024355 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.631374 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.631440 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.631459 4636 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.631542 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:43.631514523 +0000 UTC m=+53.490240970 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.631394 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.631595 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.631608 4636 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:27 crc kubenswrapper[4636]: E1003 14:01:27.631645 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 14:01:43.631636456 +0000 UTC m=+53.490362943 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.635381 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.635414 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.635426 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.635444 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.635497 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:27Z","lastTransitionTime":"2025-10-03T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.737924 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.737966 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.737978 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.737995 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.738006 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:27Z","lastTransitionTime":"2025-10-03T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.840838 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.840894 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.840903 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.840917 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.840929 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:27Z","lastTransitionTime":"2025-10-03T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.942804 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.942966 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.943026 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.943086 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:27 crc kubenswrapper[4636]: I1003 14:01:27.943173 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:27Z","lastTransitionTime":"2025-10-03T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.038058 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7xd5_564529e3-ff40-4923-9f6d-319a9b41720a/ovnkube-controller/0.log" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.040531 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerStarted","Data":"5cdbb9faa17368d9a0a2c16db6078c888138ee5270a30baf9a7ca1a9a617cd2c"} Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.042009 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" event={"ID":"5045a93a-725e-48c0-b553-2c10569de997","Type":"ContainerStarted","Data":"79b3819c73c2b8e54026a6a13ab75376992530ca159f858bdafa1183a699beb8"} Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.045212 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.045231 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.045239 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.045249 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.045257 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:28Z","lastTransitionTime":"2025-10-03T14:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.147513 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.147553 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.147562 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.147575 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.147584 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:28Z","lastTransitionTime":"2025-10-03T14:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.252292 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.252378 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.252396 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.252422 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.252439 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:28Z","lastTransitionTime":"2025-10-03T14:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.355715 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.355759 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.355767 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.355782 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.355791 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:28Z","lastTransitionTime":"2025-10-03T14:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.458720 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.458790 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.458803 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.458819 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.458830 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:28Z","lastTransitionTime":"2025-10-03T14:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.561902 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.561986 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.562014 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.562284 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.562315 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:28Z","lastTransitionTime":"2025-10-03T14:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.640075 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs\") pod \"network-metrics-daemon-vm9z7\" (UID: \"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\") " pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:28 crc kubenswrapper[4636]: E1003 14:01:28.640242 4636 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:01:28 crc kubenswrapper[4636]: E1003 14:01:28.640328 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs podName:a7f8fb91-fbef-43b5-b771-f376cfbb1cdd nodeName:}" failed. No retries permitted until 2025-10-03 14:01:30.64030926 +0000 UTC m=+40.499035507 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs") pod "network-metrics-daemon-vm9z7" (UID: "a7f8fb91-fbef-43b5-b771-f376cfbb1cdd") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.665708 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.665733 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.665741 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.665755 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.665765 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:28Z","lastTransitionTime":"2025-10-03T14:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.768085 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.768177 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.768191 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.768206 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.768246 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:28Z","lastTransitionTime":"2025-10-03T14:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.793849 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.793890 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:28 crc kubenswrapper[4636]: E1003 14:01:28.794048 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.794214 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.794229 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:28 crc kubenswrapper[4636]: E1003 14:01:28.794373 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:01:28 crc kubenswrapper[4636]: E1003 14:01:28.794299 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:01:28 crc kubenswrapper[4636]: E1003 14:01:28.794621 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.872261 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.872316 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.872331 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.872352 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.872365 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:28Z","lastTransitionTime":"2025-10-03T14:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.975780 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.976039 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.976133 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.976209 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:28 crc kubenswrapper[4636]: I1003 14:01:28.976274 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:28Z","lastTransitionTime":"2025-10-03T14:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.045834 4636 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.060361 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.075005 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.078958 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.079005 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.079017 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.079036 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.079052 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:29Z","lastTransitionTime":"2025-10-03T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.103250 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0ffc7d44d1c99317dcab49257065f81caa0658ac771596f7c192e18452bc02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0ffc7d44d1c99317dcab49257065f81caa0658ac771596f7c192e18452bc02\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 14:01:26.024574 5821 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 14:01:26.024674 5821 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 14:01:26.024698 5821 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 14:01:26.024715 5821 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 14:01:26.024730 5821 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 14:01:26.024749 5821 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 14:01:26.024763 5821 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 14:01:26.024767 5821 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 14:01:26.024774 5821 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 14:01:26.024785 5821 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 14:01:26.024793 5821 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 14:01:26.024799 5821 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 14:01:26.024818 5821 factory.go:656] Stopping watch factory\\\\nI1003 14:01:26.024830 5821 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:01:26.024852 5821 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.117916 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm9z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.130065 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.142026 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.154853 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-l
ib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.166037 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.178265 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 
shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.181402 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.181431 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.181439 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.181453 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.181462 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:29Z","lastTransitionTime":"2025-10-03T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.190777 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.201508 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.211753 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.221135 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.233721 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.254327 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.268766 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb259a3c7fa12a75c4729979def8392191091bb90da1bce6bb7112e0a5995392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b3819c73c2b8e54026a6a13ab75376992530ca159f858bdafa1183a699beb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 
14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.280359 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.284037 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.284062 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.284070 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.284082 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.284091 4636 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:29Z","lastTransitionTime":"2025-10-03T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.295242 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.312897 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.328306 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.343879 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.361370 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.379484 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f540
7f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-bi
nary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.387340 4636 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.387391 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.387404 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.387426 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.387441 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:29Z","lastTransitionTime":"2025-10-03T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.392808 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb259a3c7fa12a75c4729979def8392191091bb90da1bce6bb7112e0a5995392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b3819c73c2b8e54026a6a13ab75376992530ca159f858bdafa1183a699beb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.406945 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4b
a8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.423827 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.446903 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cdbb9faa17368d9a0a2c16db6078c888138ee52
70a30baf9a7ca1a9a617cd2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0ffc7d44d1c99317dcab49257065f81caa0658ac771596f7c192e18452bc02\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 14:01:26.024574 5821 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 14:01:26.024674 5821 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 14:01:26.024698 5821 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 14:01:26.024715 5821 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 14:01:26.024730 5821 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 14:01:26.024749 5821 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 14:01:26.024763 5821 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 14:01:26.024767 5821 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 14:01:26.024774 5821 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 14:01:26.024785 5821 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 14:01:26.024793 5821 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 14:01:26.024799 5821 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 14:01:26.024818 5821 factory.go:656] Stopping watch factory\\\\nI1003 14:01:26.024830 5821 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:01:26.024852 5821 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.461521 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm9z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.474802 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.489656 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.489702 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.489714 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.489748 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.489758 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:29Z","lastTransitionTime":"2025-10-03T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.491068 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.507307 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.519625 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.592186 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.592266 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.592280 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.592300 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.592315 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:29Z","lastTransitionTime":"2025-10-03T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.695414 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.695456 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.695467 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.695484 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.695494 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:29Z","lastTransitionTime":"2025-10-03T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.794414 4636 scope.go:117] "RemoveContainer" containerID="372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.797519 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.797568 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.797577 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.797595 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.797612 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:29Z","lastTransitionTime":"2025-10-03T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.900684 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.900895 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.900911 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.900937 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:29 crc kubenswrapper[4636]: I1003 14:01:29.900956 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:29Z","lastTransitionTime":"2025-10-03T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.007693 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.007767 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.007781 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.007805 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.007821 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:30Z","lastTransitionTime":"2025-10-03T14:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.110990 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.111088 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.111097 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.111133 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.111149 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:30Z","lastTransitionTime":"2025-10-03T14:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.215691 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.215749 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.215763 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.215785 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.215978 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:30Z","lastTransitionTime":"2025-10-03T14:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.319482 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.319529 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.319538 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.319559 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.319572 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:30Z","lastTransitionTime":"2025-10-03T14:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.421813 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.421848 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.421856 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.421869 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.421878 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:30Z","lastTransitionTime":"2025-10-03T14:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.525138 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.525180 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.525192 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.525213 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.525227 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:30Z","lastTransitionTime":"2025-10-03T14:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.627296 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.627343 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.627358 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.627374 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.627384 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:30Z","lastTransitionTime":"2025-10-03T14:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.662105 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs\") pod \"network-metrics-daemon-vm9z7\" (UID: \"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\") " pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:30 crc kubenswrapper[4636]: E1003 14:01:30.662280 4636 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:01:30 crc kubenswrapper[4636]: E1003 14:01:30.662335 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs podName:a7f8fb91-fbef-43b5-b771-f376cfbb1cdd nodeName:}" failed. No retries permitted until 2025-10-03 14:01:34.662319147 +0000 UTC m=+44.521045394 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs") pod "network-metrics-daemon-vm9z7" (UID: "a7f8fb91-fbef-43b5-b771-f376cfbb1cdd") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.729190 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.729237 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.729247 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.729262 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.729288 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:30Z","lastTransitionTime":"2025-10-03T14:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.793756 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.793788 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.793857 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.793769 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:30 crc kubenswrapper[4636]: E1003 14:01:30.793882 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:01:30 crc kubenswrapper[4636]: E1003 14:01:30.793966 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:01:30 crc kubenswrapper[4636]: E1003 14:01:30.794120 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:01:30 crc kubenswrapper[4636]: E1003 14:01:30.794161 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.821065 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.832328 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.832369 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.832379 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.832395 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.832407 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:30Z","lastTransitionTime":"2025-10-03T14:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.836289 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.846615 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.863748 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.881598 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 
shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.894451 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb259a3c7fa12a75c4729979def8392191091bb90da1bce6bb7112e0a5995392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b3819c73c2b8e54026a6a13ab75376992530ca159f858bdafa1183a699beb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:30Z is after 2025-08-24T17:21:41Z" Oct 03 
14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.909128 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.928100 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.934623 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.934686 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:30 crc 
kubenswrapper[4636]: I1003 14:01:30.934698 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.934722 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.934739 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:30Z","lastTransitionTime":"2025-10-03T14:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.954252 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cdbb9faa17368d9a0a2c16db6078c888138ee52
70a30baf9a7ca1a9a617cd2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0ffc7d44d1c99317dcab49257065f81caa0658ac771596f7c192e18452bc02\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 14:01:26.024574 5821 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 14:01:26.024674 5821 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 14:01:26.024698 5821 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 14:01:26.024715 5821 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 14:01:26.024730 5821 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 14:01:26.024749 5821 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 14:01:26.024763 5821 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 14:01:26.024767 5821 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 14:01:26.024774 5821 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 14:01:26.024785 5821 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 14:01:26.024793 5821 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 14:01:26.024799 5821 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 14:01:26.024818 5821 factory.go:656] Stopping watch factory\\\\nI1003 14:01:26.024830 5821 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:01:26.024852 5821 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.973792 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm9z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:30 crc kubenswrapper[4636]: I1003 14:01:30.987916 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.000024 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.012312 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.027186 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.037797 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.037839 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.037850 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.037865 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.037876 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:31Z","lastTransitionTime":"2025-10-03T14:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.054073 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.056045 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6f185765f3ef723b58932469681a3cb9249e44073b26c30748a9716a1b7dd51d"} Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.056451 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.058302 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7xd5_564529e3-ff40-4923-9f6d-319a9b41720a/ovnkube-controller/1.log" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.058975 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7xd5_564529e3-ff40-4923-9f6d-319a9b41720a/ovnkube-controller/0.log" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.061937 4636 generic.go:334] "Generic (PLEG): container finished" podID="564529e3-ff40-4923-9f6d-319a9b41720a" containerID="5cdbb9faa17368d9a0a2c16db6078c888138ee5270a30baf9a7ca1a9a617cd2c" exitCode=1 Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.061977 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerDied","Data":"5cdbb9faa17368d9a0a2c16db6078c888138ee5270a30baf9a7ca1a9a617cd2c"} Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.062007 4636 scope.go:117] "RemoveContainer" containerID="ac0ffc7d44d1c99317dcab49257065f81caa0658ac771596f7c192e18452bc02" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.063005 4636 scope.go:117] "RemoveContainer" containerID="5cdbb9faa17368d9a0a2c16db6078c888138ee5270a30baf9a7ca1a9a617cd2c" Oct 03 14:01:31 crc kubenswrapper[4636]: E1003 14:01:31.063198 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-t7xd5_openshift-ovn-kubernetes(564529e3-ff40-4923-9f6d-319a9b41720a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.077395 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.091526 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.114889 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f185765f3ef723b58932469681a3cb9249e44073b26c30748a9716a1b7dd51d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.134731 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.139900 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.139934 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.139942 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.139956 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.139965 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:31Z","lastTransitionTime":"2025-10-03T14:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.148008 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.164341 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.174854 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.189694 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.206559 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.222224 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb259a3c7fa12a75c4729979def8392191091bb90da1bce6bb7112e0a5995392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b3819c73c2b8e54026a6a13ab75376992530ca159f858bdafa1183a699beb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:31Z is after 2025-08-24T17:21:41Z" Oct 03 
14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.238587 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.243310 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.243348 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.243357 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.243372 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.243382 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:31Z","lastTransitionTime":"2025-10-03T14:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.255697 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.279322 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cdbb9faa17368d9a0a2c16db6078c888138ee5270a30baf9a7ca1a9a617cd2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0ffc7d44d1c99317dcab49257065f81caa0658ac771596f7c192e18452bc02\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 14:01:26.024574 5821 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 14:01:26.024674 5821 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 14:01:26.024698 5821 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 14:01:26.024715 5821 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 14:01:26.024730 5821 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 14:01:26.024749 5821 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 14:01:26.024763 5821 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 14:01:26.024767 5821 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 14:01:26.024774 5821 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 14:01:26.024785 5821 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 14:01:26.024793 5821 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 14:01:26.024799 5821 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 14:01:26.024818 5821 factory.go:656] Stopping watch factory\\\\nI1003 14:01:26.024830 5821 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:01:26.024852 5821 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cdbb9faa17368d9a0a2c16db6078c888138ee5270a30baf9a7ca1a9a617cd2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:30Z\\\",\\\"message\\\":\\\"rnetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 14:01:30.490731 6029 services_controller.go:445] Built service openshift-cluster-version/cluster-version-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nI1003 14:01:30.491712 6029 services_controller.go:451] Built service openshift-cluster-version/cluster-version-operator cluster-wide LB for network=default: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-cluster-version/cluster-version-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.182\\\\\\\", Port:9099, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"}
,{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.293477 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm9z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.308820 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.321689 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.334487 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-l
ib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.345768 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.345811 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.345822 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.345839 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.345849 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:31Z","lastTransitionTime":"2025-10-03T14:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.347901 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.448686 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.448713 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.448725 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.448739 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.448749 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:31Z","lastTransitionTime":"2025-10-03T14:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.551031 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.551068 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.551076 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.551091 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.551118 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:31Z","lastTransitionTime":"2025-10-03T14:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.653696 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.653733 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.653743 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.653756 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.653765 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:31Z","lastTransitionTime":"2025-10-03T14:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.755549 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.755584 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.755594 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.755609 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.755620 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:31Z","lastTransitionTime":"2025-10-03T14:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.858206 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.858250 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.858261 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.858278 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.858289 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:31Z","lastTransitionTime":"2025-10-03T14:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
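Note: this long stretch of the log is a single startup loop. The kubelet re-records NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID and NodeNotReady roughly every 100ms because the container runtime keeps reporting NetworkReady=false until a CNI configuration file appears in /etc/kubernetes/cni/net.d/ (on this cluster that file is written by the OVN-Kubernetes pods, whose ovnkube-node logs appear further down). A minimal sketch of the condition being polled, assuming the directory path from the log and the usual CNI config extensions (Python):

    # Wait until a CNI network config shows up where the kubelet expects it.
    # Directory path comes from the log; the extension list matches what
    # libcni loads (.conf, .conflist, .json).
    import glob
    import os
    import time

    CNI_DIR = "/etc/kubernetes/cni/net.d"

    def cni_configs():
        return sorted(
            path
            for pattern in ("*.conf", "*.conflist", "*.json")
            for path in glob.glob(os.path.join(CNI_DIR, pattern))
        )

    while not cni_configs():
        print(f"no CNI configuration file in {CNI_DIR}/ yet; waiting...")
        time.sleep(2)
    print("CNI config present:", cni_configs())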
Has your network provider started?"} Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.960893 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.960963 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.960976 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.960992 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:31 crc kubenswrapper[4636]: I1003 14:01:31.961005 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:31Z","lastTransitionTime":"2025-10-03T14:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.063804 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.063842 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.063851 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.063865 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.063874 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:32Z","lastTransitionTime":"2025-10-03T14:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.066162 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7xd5_564529e3-ff40-4923-9f6d-319a9b41720a/ovnkube-controller/1.log" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.165911 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.165946 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.165954 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.165970 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.165981 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:32Z","lastTransitionTime":"2025-10-03T14:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.268479 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.268518 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.268529 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.268545 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.268557 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:32Z","lastTransitionTime":"2025-10-03T14:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.371121 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.371405 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.371414 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.371427 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.371436 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:32Z","lastTransitionTime":"2025-10-03T14:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.473355 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.473389 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.473397 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.473412 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.473423 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:32Z","lastTransitionTime":"2025-10-03T14:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.576062 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.576094 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.576119 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.576131 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.576140 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:32Z","lastTransitionTime":"2025-10-03T14:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.679252 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.679308 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.679317 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.679334 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.679346 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:32Z","lastTransitionTime":"2025-10-03T14:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.783603 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.783643 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.783653 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.783670 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.783681 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:32Z","lastTransitionTime":"2025-10-03T14:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.793410 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.793517 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.793586 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:32 crc kubenswrapper[4636]: E1003 14:01:32.793680 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:01:32 crc kubenswrapper[4636]: E1003 14:01:32.793806 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.793870 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:32 crc kubenswrapper[4636]: E1003 14:01:32.793903 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:01:32 crc kubenswrapper[4636]: E1003 14:01:32.794038 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.886614 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.886692 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.886715 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.886744 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.886764 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:32Z","lastTransitionTime":"2025-10-03T14:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.989385 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.989410 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.989419 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.989432 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:32 crc kubenswrapper[4636]: I1003 14:01:32.989444 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:32Z","lastTransitionTime":"2025-10-03T14:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.092136 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.092190 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.092205 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.092224 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.092237 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:33Z","lastTransitionTime":"2025-10-03T14:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.194546 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.194584 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.194599 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.194615 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.194625 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:33Z","lastTransitionTime":"2025-10-03T14:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.297212 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.297248 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.297266 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.297283 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.297295 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:33Z","lastTransitionTime":"2025-10-03T14:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.399155 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.399184 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.399192 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.399206 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.399214 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:33Z","lastTransitionTime":"2025-10-03T14:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.502211 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.502270 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.502281 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.502308 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.502322 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:33Z","lastTransitionTime":"2025-10-03T14:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.605337 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.605390 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.605399 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.605412 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.605421 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:33Z","lastTransitionTime":"2025-10-03T14:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.707388 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.707437 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.707451 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.707471 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.707483 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:33Z","lastTransitionTime":"2025-10-03T14:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.809555 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.809599 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.809610 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.809625 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.809637 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:33Z","lastTransitionTime":"2025-10-03T14:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.912275 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.912315 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.912325 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.912338 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:33 crc kubenswrapper[4636]: I1003 14:01:33.912348 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:33Z","lastTransitionTime":"2025-10-03T14:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.014668 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.014712 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.014724 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.014742 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.014754 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:34Z","lastTransitionTime":"2025-10-03T14:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.117725 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.117809 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.117831 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.117865 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.117888 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:34Z","lastTransitionTime":"2025-10-03T14:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.220251 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.220293 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.220305 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.220322 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.220332 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:34Z","lastTransitionTime":"2025-10-03T14:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.323083 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.323156 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.323165 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.323177 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.323201 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:34Z","lastTransitionTime":"2025-10-03T14:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.425393 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.425440 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.425453 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.425471 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.425481 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:34Z","lastTransitionTime":"2025-10-03T14:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.528038 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.528077 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.528089 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.528127 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.528138 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:34Z","lastTransitionTime":"2025-10-03T14:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.647643 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.647682 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.647690 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.647704 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.647713 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:34Z","lastTransitionTime":"2025-10-03T14:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.698781 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs\") pod \"network-metrics-daemon-vm9z7\" (UID: \"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\") " pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:34 crc kubenswrapper[4636]: E1003 14:01:34.698895 4636 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:01:34 crc kubenswrapper[4636]: E1003 14:01:34.698962 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs podName:a7f8fb91-fbef-43b5-b771-f376cfbb1cdd nodeName:}" failed. No retries permitted until 2025-10-03 14:01:42.698944475 +0000 UTC m=+52.557670722 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs") pod "network-metrics-daemon-vm9z7" (UID: "a7f8fb91-fbef-43b5-b771-f376cfbb1cdd") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.749539 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.749588 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.749597 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.749609 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.749619 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:34Z","lastTransitionTime":"2025-10-03T14:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.793160 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.793241 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.793175 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.793160 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:34 crc kubenswrapper[4636]: E1003 14:01:34.793367 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:01:34 crc kubenswrapper[4636]: E1003 14:01:34.793483 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
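Note: the MountVolume.SetUp failure for "metrics-certs" is a startup-ordering symptom rather than a missing secret: 'object "openshift-multus"/"metrics-daemon-secret" not registered' means the kubelet's watch-based secret manager has not yet (re)registered this pod, so the volume plugin cannot resolve the secret and the operation is parked with exponential backoff ("No retries permitted until ... durationBeforeRetry 8s"). The retry pattern looks roughly like the sketch below; the constants are illustrative assumptions, not the kubelet's exact tuning (Python):

    # Illustrative exponential backoff schedule in the style of the kubelet's
    # nestedpendingoperations retries. Initial delay, factor and cap are
    # assumptions for the sketch only.
    import datetime

    def backoff_schedule(initial=0.5, factor=2.0, cap=128.0, attempts=8,
                         start=datetime.datetime(2025, 10, 3, 14, 1, 34)):
        delay, now = initial, start
        for attempt in range(1, attempts + 1):
            now += datetime.timedelta(seconds=delay)
            yield attempt, delay, now
            delay = min(delay * factor, cap)

    for attempt, delay, when in backoff_schedule():
        print(f"attempt {attempt}: durationBeforeRetry {delay:g}s -> retry at {when:%H:%M:%S}")

Under these illustrative constants the fifth retry would be parked for 8 seconds, which lines up with the durationBeforeRetry seen above.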
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:01:34 crc kubenswrapper[4636]: E1003 14:01:34.793643 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:01:34 crc kubenswrapper[4636]: E1003 14:01:34.793719 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.852296 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.852353 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.852756 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.852783 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.852803 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:34Z","lastTransitionTime":"2025-10-03T14:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.955771 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.955816 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.955827 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.955842 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:34 crc kubenswrapper[4636]: I1003 14:01:34.955854 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:34Z","lastTransitionTime":"2025-10-03T14:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.058321 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.058360 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.058372 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.058388 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.058400 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:35Z","lastTransitionTime":"2025-10-03T14:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.160420 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.160466 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.160481 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.160503 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.160517 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:35Z","lastTransitionTime":"2025-10-03T14:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.262420 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.262530 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.262546 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.262560 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.262571 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:35Z","lastTransitionTime":"2025-10-03T14:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.364907 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.364952 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.364965 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.364980 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.364990 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:35Z","lastTransitionTime":"2025-10-03T14:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.466919 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.466949 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.466958 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.466972 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.466980 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:35Z","lastTransitionTime":"2025-10-03T14:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.570727 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.570773 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.570784 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.570802 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.570815 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:35Z","lastTransitionTime":"2025-10-03T14:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.673522 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.673561 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.673571 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.673585 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:35 crc kubenswrapper[4636]: I1003 14:01:35.673594 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:35Z","lastTransitionTime":"2025-10-03T14:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[... the same five-entry stanza (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, "Node became not ready") repeats verbatim at roughly 100 ms intervals from 14:01:35.776 through 14:01:36.700, with only the timestamps changing ...]
Oct 03 14:01:36 crc kubenswrapper[4636]: I1003 14:01:36.793164 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 14:01:36 crc kubenswrapper[4636]: E1003 14:01:36.793918 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 03 14:01:36 crc kubenswrapper[4636]: I1003 14:01:36.793243 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7"
Oct 03 14:01:36 crc kubenswrapper[4636]: E1003 14:01:36.794402 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd"
Oct 03 14:01:36 crc kubenswrapper[4636]: I1003 14:01:36.793293 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 14:01:36 crc kubenswrapper[4636]: E1003 14:01:36.794805 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 03 14:01:36 crc kubenswrapper[4636]: I1003 14:01:36.793164 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 14:01:36 crc kubenswrapper[4636]: E1003 14:01:36.795161 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
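All four pods above are blocked on the same problem the readiness stanza keeps reporting: the kubelet finds no CNI configuration. Below is a minimal diagnostic sketch in Go (the language the kubelet itself is written in) that checks the directory named in the error for candidate config files. The path is taken from the log; everything else is illustrative, and the real kubelet/CRI-O readiness check also parses and validates the configs, so this only approximates its first step.

// checkcni.go: report whether any CNI network config is present in the
// directory the kubelet complains about. Diagnostic sketch only; it tests
// for candidate files, it does not validate their contents.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	const dir = "/etc/kubernetes/cni/net.d" // path taken from the log message
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", dir, err)
		os.Exit(1)
	}
	found := 0
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // extensions CNI config loaders accept
			fmt.Println("candidate CNI config:", e.Name())
			found++
		}
	}
	if found == 0 {
		fmt.Println("no CNI configuration files found; NetworkReady will stay false")
	}
}

An empty result here is consistent with the log: until the network provider writes a config into that directory, the runtime keeps reporting NetworkReady=false and sandbox creation for these pods cannot proceed.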
[... the stanza continues at the same cadence from 14:01:36.803 through 14:01:37.796 ...]
Oct 03 14:01:37 crc kubenswrapper[4636]: E1003 14:01:37.821283 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:37Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.826647 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.826795 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.826883 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.826971 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.827053 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:37Z","lastTransitionTime":"2025-10-03T14:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:37 crc kubenswrapper[4636]: E1003 14:01:37.849641 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:37Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.854519 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.854754 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.854940 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.855150 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.855316 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:37Z","lastTransitionTime":"2025-10-03T14:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:37 crc kubenswrapper[4636]: E1003 14:01:37.882142 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:37Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.888208 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.888289 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.888309 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.888329 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.888378 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:37Z","lastTransitionTime":"2025-10-03T14:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:37 crc kubenswrapper[4636]: E1003 14:01:37.906020 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:37Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.912056 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.912159 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.912178 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.912206 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.912225 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:37Z","lastTransitionTime":"2025-10-03T14:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:37 crc kubenswrapper[4636]: E1003 14:01:37.934319 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:37Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:37 crc kubenswrapper[4636]: E1003 14:01:37.934709 4636 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.937628 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.937706 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.937766 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.937794 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:37 crc kubenswrapper[4636]: I1003 14:01:37.938396 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:37Z","lastTransitionTime":"2025-10-03T14:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.042545 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.042603 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.042617 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.042651 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.042664 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:38Z","lastTransitionTime":"2025-10-03T14:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.145819 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.146068 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.146209 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.146310 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.146408 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:38Z","lastTransitionTime":"2025-10-03T14:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.249418 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.249713 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.249810 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.249951 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.250039 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:38Z","lastTransitionTime":"2025-10-03T14:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.353728 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.353814 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.353835 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.353873 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.353899 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:38Z","lastTransitionTime":"2025-10-03T14:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.456953 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.457031 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.457049 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.457077 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.457146 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:38Z","lastTransitionTime":"2025-10-03T14:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.560694 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.560810 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.560829 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.560857 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.560878 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:38Z","lastTransitionTime":"2025-10-03T14:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.666060 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.666171 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.666191 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.666220 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.666239 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:38Z","lastTransitionTime":"2025-10-03T14:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.771602 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.771722 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.771750 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.771826 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.771887 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:38Z","lastTransitionTime":"2025-10-03T14:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.793151 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.793218 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.793189 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:38 crc kubenswrapper[4636]: E1003 14:01:38.793380 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.793423 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:38 crc kubenswrapper[4636]: E1003 14:01:38.793535 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:01:38 crc kubenswrapper[4636]: E1003 14:01:38.793623 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:01:38 crc kubenswrapper[4636]: E1003 14:01:38.793728 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.875641 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.875694 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.875703 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.875723 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.875735 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:38Z","lastTransitionTime":"2025-10-03T14:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.978790 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.978838 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.978849 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.978867 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:38 crc kubenswrapper[4636]: I1003 14:01:38.978884 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:38Z","lastTransitionTime":"2025-10-03T14:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.080989 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.081033 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.081045 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.081062 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.081074 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:39Z","lastTransitionTime":"2025-10-03T14:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.184385 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.184463 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.184480 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.184503 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.184518 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:39Z","lastTransitionTime":"2025-10-03T14:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.288768 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.288852 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.288870 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.288896 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.288914 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:39Z","lastTransitionTime":"2025-10-03T14:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.392806 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.392859 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.392869 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.392886 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.392897 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:39Z","lastTransitionTime":"2025-10-03T14:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.499138 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.499477 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.499622 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.499725 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.499824 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:39Z","lastTransitionTime":"2025-10-03T14:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.602652 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.602734 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.602746 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.602763 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.602775 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:39Z","lastTransitionTime":"2025-10-03T14:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.704699 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.704742 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.704753 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.704769 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.704781 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:39Z","lastTransitionTime":"2025-10-03T14:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.807932 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.807967 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.807976 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.807991 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.808002 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:39Z","lastTransitionTime":"2025-10-03T14:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.910395 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.910428 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.910436 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.910449 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:39 crc kubenswrapper[4636]: I1003 14:01:39.910459 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:39Z","lastTransitionTime":"2025-10-03T14:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.013489 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.013529 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.013539 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.013553 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.013562 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:40Z","lastTransitionTime":"2025-10-03T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.116410 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.116483 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.116539 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.116560 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.116573 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:40Z","lastTransitionTime":"2025-10-03T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.219947 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.219982 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.219994 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.220012 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.220021 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:40Z","lastTransitionTime":"2025-10-03T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.323039 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.323134 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.323147 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.323189 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.323203 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:40Z","lastTransitionTime":"2025-10-03T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.425560 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.426038 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.426151 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.426249 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.426323 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:40Z","lastTransitionTime":"2025-10-03T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.528486 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.528519 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.528566 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.528582 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.528590 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:40Z","lastTransitionTime":"2025-10-03T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.631354 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.631611 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.631765 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.631913 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.632049 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:40Z","lastTransitionTime":"2025-10-03T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.734784 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.735050 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.735160 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.735262 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.735374 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:40Z","lastTransitionTime":"2025-10-03T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.792691 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.792743 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.792696 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 14:01:40 crc kubenswrapper[4636]: E1003 14:01:40.792865 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd"
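The block above is the kubelet's node-status sync loop: roughly every 100 ms it re-records the same four node events and re-posts Ready=False, because the container runtime keeps reporting NetworkReady=false until a CNI network config appears in /etc/kubernetes/cni/net.d/. This is expected during startup while the network plugin has not yet written its config file; the spam stops the moment one shows up. Below is a minimal Go sketch of that same directory check, assuming only the path from the error message and the config extensions libcni accepts; it is a diagnostic illustration, not kubelet code.

package main

// Minimal sketch (not from the log): poll the CNI confdir the kubelet
// complains about and report when a network config appears. The directory
// and extensions are taken from / implied by the error message above.

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
	"time"
)

const confDir = "/etc/kubernetes/cni/net.d" // path from the log message

// cniConfigs lists files in dir whose extension libcni would load.
func cniConfigs(dir string) ([]string, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return nil, err
	}
	var found []string
	for _, e := range entries {
		switch strings.ToLower(filepath.Ext(e.Name())) {
		case ".conf", ".conflist", ".json":
			found = append(found, e.Name())
		}
	}
	return found, nil
}

func main() {
	for {
		found, err := cniConfigs(confDir)
		switch {
		case err != nil:
			fmt.Println("confdir not readable:", err)
		case len(found) == 0:
			fmt.Println("no CNI configuration file yet; node stays NotReady")
		default:
			fmt.Println("CNI config present:", found)
			return
		}
		time.Sleep(5 * time.Second)
	}
}

Run on the node, it exits as soon as a *.conf/*.conflist/*.json file lands in the directory, which is the same condition that clears the NodeNotReady loop above.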
pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.792972 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:40 crc kubenswrapper[4636]: E1003 14:01:40.792969 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:01:40 crc kubenswrapper[4636]: E1003 14:01:40.793064 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:01:40 crc kubenswrapper[4636]: E1003 14:01:40.793085 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.806214 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f185765f3ef723b58932469681a3cb9249e44073b26c30748a9716a1b7dd51d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 
shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.817629 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.827556 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.835695 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.837206 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.837249 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.837263 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.837280 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.837292 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:40Z","lastTransitionTime":"2025-10-03T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.846443 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.858952 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.876125 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.886662 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb259a3c7fa12a75c4729979def8392191091bb90da1bce6bb7112e0a5995392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b3819c73c2b8e54026a6a13ab75376992530ca159f858bdafa1183a699beb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:40Z is after 2025-08-24T17:21:41Z" Oct 03 
14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.898553 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.911928 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.930915 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cdbb9faa17368d9a0a2c16db6078c888138ee52
70a30baf9a7ca1a9a617cd2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac0ffc7d44d1c99317dcab49257065f81caa0658ac771596f7c192e18452bc02\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 14:01:26.024574 5821 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 14:01:26.024674 5821 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 14:01:26.024698 5821 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 14:01:26.024715 5821 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 14:01:26.024730 5821 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 14:01:26.024749 5821 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 14:01:26.024763 5821 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 14:01:26.024767 5821 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 14:01:26.024774 5821 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 14:01:26.024785 5821 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 14:01:26.024793 5821 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 14:01:26.024799 5821 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 14:01:26.024818 5821 factory.go:656] Stopping watch factory\\\\nI1003 14:01:26.024830 5821 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:01:26.024852 5821 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cdbb9faa17368d9a0a2c16db6078c888138ee5270a30baf9a7ca1a9a617cd2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:30Z\\\",\\\"message\\\":\\\"rnetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 14:01:30.490731 6029 services_controller.go:445] Built service openshift-cluster-version/cluster-version-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nI1003 14:01:30.491712 6029 services_controller.go:451] Built service openshift-cluster-version/cluster-version-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-cluster-version/cluster-version-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.182\\\\\\\", Port:9099, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.939196 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.939242 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.939252 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.939266 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.939276 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:40Z","lastTransitionTime":"2025-10-03T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.946136 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm9z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.960818 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.973501 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:40 crc kubenswrapper[4636]: I1003 14:01:40.984748 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-l
ib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.002549 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:40Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.041420 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.041452 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.041463 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.041477 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.041487 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:41Z","lastTransitionTime":"2025-10-03T14:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.074319 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.075243 4636 scope.go:117] "RemoveContainer" containerID="5cdbb9faa17368d9a0a2c16db6078c888138ee5270a30baf9a7ca1a9a617cd2c" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.089417 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.103627 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.121406 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cdbb9faa17368d9a0a2c16db6078c888138ee52
70a30baf9a7ca1a9a617cd2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cdbb9faa17368d9a0a2c16db6078c888138ee5270a30baf9a7ca1a9a617cd2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:30Z\\\",\\\"message\\\":\\\"rnetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 14:01:30.490731 6029 services_controller.go:445] Built service openshift-cluster-version/cluster-version-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nI1003 14:01:30.491712 6029 services_controller.go:451] Built service openshift-cluster-version/cluster-version-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-cluster-version/cluster-version-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.182\\\\\\\", Port:9099, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t7xd5_openshift-ovn-kubernetes(564529e3-ff40-4923-9f6d-319a9b41720a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.132767 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm9z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.143906 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.143939 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.143947 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.143960 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.143969 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:41Z","lastTransitionTime":"2025-10-03T14:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.146803 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.157778 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.170157 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.180121 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.193612 4636 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f185765f3ef723b58932469681a3cb9249e44073b26c30748a9716a1b7dd51d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.206568 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.219162 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.229838 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.241265 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.245727 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.245762 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.245773 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.245792 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.245821 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:41Z","lastTransitionTime":"2025-10-03T14:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.252976 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.265864 4636 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:41Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.276272 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb259a3c7fa12a75c4729979def8392191091bb90da1bce6bb7112e0a5995392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b3819c73c2b8e54026a6a13ab75376992530ca159f858bdafa1183a699beb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:41Z is after 2025-08-24T17:21:41Z" Oct 03 
14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.348023 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.348069 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.348083 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.348116 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.348128 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:41Z","lastTransitionTime":"2025-10-03T14:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.450240 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.450269 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.450278 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.450291 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.450301 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:41Z","lastTransitionTime":"2025-10-03T14:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.552736 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.552793 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.552806 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.552824 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.552837 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:41Z","lastTransitionTime":"2025-10-03T14:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.655152 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.655190 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.655201 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.655215 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.655226 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:41Z","lastTransitionTime":"2025-10-03T14:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.757722 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.757761 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.757772 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.757789 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.757800 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:41Z","lastTransitionTime":"2025-10-03T14:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.859948 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.860217 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.860294 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.860354 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.860410 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:41Z","lastTransitionTime":"2025-10-03T14:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.965856 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.965901 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.965915 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.965936 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:41 crc kubenswrapper[4636]: I1003 14:01:41.965947 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:41Z","lastTransitionTime":"2025-10-03T14:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.070046 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.070296 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.070305 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.070318 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.070328 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:42Z","lastTransitionTime":"2025-10-03T14:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.104995 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7xd5_564529e3-ff40-4923-9f6d-319a9b41720a/ovnkube-controller/2.log" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.105657 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7xd5_564529e3-ff40-4923-9f6d-319a9b41720a/ovnkube-controller/1.log" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.108031 4636 generic.go:334] "Generic (PLEG): container finished" podID="564529e3-ff40-4923-9f6d-319a9b41720a" containerID="22d36197af6590868a0e89840e2e6e0fa753066ca239bc3bb1e2474ba3d1f228" exitCode=1 Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.108064 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerDied","Data":"22d36197af6590868a0e89840e2e6e0fa753066ca239bc3bb1e2474ba3d1f228"} Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.108111 4636 scope.go:117] "RemoveContainer" containerID="5cdbb9faa17368d9a0a2c16db6078c888138ee5270a30baf9a7ca1a9a617cd2c" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.108699 4636 scope.go:117] "RemoveContainer" containerID="22d36197af6590868a0e89840e2e6e0fa753066ca239bc3bb1e2474ba3d1f228" Oct 03 14:01:42 crc kubenswrapper[4636]: E1003 14:01:42.108902 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-t7xd5_openshift-ovn-kubernetes(564529e3-ff40-4923-9f6d-319a9b41720a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.124946 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.139230 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.150656 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb259a3c7fa12a75c4729979def8392191091bb90da1bce6bb7112e0a5995392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b3819c73c2b8e54026a6a13ab75376992530ca159f858bdafa1183a699beb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:42Z is after 2025-08-24T17:21:41Z" Oct 03 
14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.164350 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.172345 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.172395 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.172406 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.172422 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.172433 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:42Z","lastTransitionTime":"2025-10-03T14:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.177416 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.196807 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d36197af6590868a0e89840e2e6e0fa753066ca239bc3bb1e2474ba3d1f228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cdbb9faa17368d9a0a2c16db6078c888138ee5270a30baf9a7ca1a9a617cd2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:30Z\\\",\\\"message\\\":\\\"rnetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 14:01:30.490731 6029 services_controller.go:445] Built service openshift-cluster-version/cluster-version-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nI1003 14:01:30.491712 6029 services_controller.go:451] Built service openshift-cluster-version/cluster-version-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-cluster-version/cluster-version-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.182\\\\\\\", Port:9099, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d36197af6590868a0e89840e2e6e0fa753066ca239bc3bb1e2474ba3d1f228\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:42Z\\\",\\\"message\\\":\\\"kube-controller-manager-crc openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-t7xd5 openshift-multus/multus-additional-cni-plugins-lbt25 openshift-image-registry/node-ca-xf7xs openshift-multus/network-metrics-daemon-vm9z7 openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-dns/node-resolver-r9xm2]\\\\nI1003 14:01:42.006517 6219 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 14:01:42.006530 6219 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006537 6219 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006543 6219 ovn.go:134] Ensuring zone local for 
Pod openshift-dns/node-resolver-r9xm2 in node crc\\\\nI1003 14:01:42.006547 6219 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-r9xm2 after 0 failed attempt(s)\\\\nI1003 14:01:42.006552 6219 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006565 6219 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:01:42.006623 6219 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\
\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.207682 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm9z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.221248 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.235676 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.249056 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-l
ib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.262748 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.274431 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.274472 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.274481 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.274495 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.274504 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:42Z","lastTransitionTime":"2025-10-03T14:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.277450 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f185765f3ef723b58932469681a3cb9249e44073b26c30748a9716a1b7dd51d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.288048 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.299766 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.307840 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.318171 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:42Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.377906 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.377944 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.377955 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.377970 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.377981 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:42Z","lastTransitionTime":"2025-10-03T14:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.479780 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.479816 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.479828 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.479844 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.479854 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:42Z","lastTransitionTime":"2025-10-03T14:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.582087 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.582135 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.582147 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.582162 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.582173 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:42Z","lastTransitionTime":"2025-10-03T14:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.684771 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.684814 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.684823 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.684837 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.684847 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:42Z","lastTransitionTime":"2025-10-03T14:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.783189 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs\") pod \"network-metrics-daemon-vm9z7\" (UID: \"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\") " pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:42 crc kubenswrapper[4636]: E1003 14:01:42.783377 4636 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:01:42 crc kubenswrapper[4636]: E1003 14:01:42.783617 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs podName:a7f8fb91-fbef-43b5-b771-f376cfbb1cdd nodeName:}" failed. No retries permitted until 2025-10-03 14:01:58.783598146 +0000 UTC m=+68.642324393 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs") pod "network-metrics-daemon-vm9z7" (UID: "a7f8fb91-fbef-43b5-b771-f376cfbb1cdd") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.787497 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.787552 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.787561 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.787576 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.787587 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:42Z","lastTransitionTime":"2025-10-03T14:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.793629 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.793651 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:42 crc kubenswrapper[4636]: E1003 14:01:42.793794 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.793891 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.793971 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:42 crc kubenswrapper[4636]: E1003 14:01:42.793926 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:01:42 crc kubenswrapper[4636]: E1003 14:01:42.794153 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:01:42 crc kubenswrapper[4636]: E1003 14:01:42.794251 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.889853 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.889887 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.889900 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.889914 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.889925 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:42Z","lastTransitionTime":"2025-10-03T14:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.992349 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.992401 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.992412 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.992425 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:42 crc kubenswrapper[4636]: I1003 14:01:42.992435 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:42Z","lastTransitionTime":"2025-10-03T14:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.095010 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.095053 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.095065 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.095080 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.095092 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:43Z","lastTransitionTime":"2025-10-03T14:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.111981 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7xd5_564529e3-ff40-4923-9f6d-319a9b41720a/ovnkube-controller/2.log" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.197758 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.197999 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.198076 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.198166 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.198240 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:43Z","lastTransitionTime":"2025-10-03T14:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.300221 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.300445 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.300540 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.300611 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.300678 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:43Z","lastTransitionTime":"2025-10-03T14:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.402764 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.402804 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.402812 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.402827 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.402839 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:43Z","lastTransitionTime":"2025-10-03T14:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.504748 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.504812 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.504822 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.504835 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.504843 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:43Z","lastTransitionTime":"2025-10-03T14:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.591589 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:01:43 crc kubenswrapper[4636]: E1003 14:01:43.591816 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:02:15.591782525 +0000 UTC m=+85.450508772 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.607265 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.607308 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.607316 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.607332 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.607343 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:43Z","lastTransitionTime":"2025-10-03T14:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.692556 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.692612 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.692633 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.692657 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:43 crc kubenswrapper[4636]: E1003 14:01:43.692741 4636 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:01:43 crc kubenswrapper[4636]: E1003 14:01:43.692795 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:01:43 crc kubenswrapper[4636]: E1003 14:01:43.692755 4636 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:01:43 crc kubenswrapper[4636]: E1003 14:01:43.692794 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:01:43 crc kubenswrapper[4636]: E1003 14:01:43.692865 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:01:43 crc kubenswrapper[4636]: E1003 14:01:43.692877 4636 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:43 crc kubenswrapper[4636]: E1003 14:01:43.692815 4636 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:02:15.692797187 +0000 UTC m=+85.551523434 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:01:43 crc kubenswrapper[4636]: E1003 14:01:43.692819 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:01:43 crc kubenswrapper[4636]: E1003 14:01:43.692923 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:02:15.69290635 +0000 UTC m=+85.551632597 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:01:43 crc kubenswrapper[4636]: E1003 14:01:43.692924 4636 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:43 crc kubenswrapper[4636]: E1003 14:01:43.692947 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 14:02:15.692938991 +0000 UTC m=+85.551665368 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:01:43 crc kubenswrapper[4636]: E1003 14:01:43.692989 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 14:02:15.692973012 +0000 UTC m=+85.551699259 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.709121 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.709160 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.709172 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.709188 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.709199 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:43Z","lastTransitionTime":"2025-10-03T14:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.811218 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.811293 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.811302 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.811315 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.811325 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:43Z","lastTransitionTime":"2025-10-03T14:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.914328 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.914361 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.914369 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.914383 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:43 crc kubenswrapper[4636]: I1003 14:01:43.914401 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:43Z","lastTransitionTime":"2025-10-03T14:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.016531 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.016605 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.016623 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.016640 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.016650 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:44Z","lastTransitionTime":"2025-10-03T14:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.117774 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.117827 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.117836 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.117848 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.117856 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:44Z","lastTransitionTime":"2025-10-03T14:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.219964 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.220015 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.220025 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.220040 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.220052 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:44Z","lastTransitionTime":"2025-10-03T14:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.322345 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.322454 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.322479 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.322509 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.322530 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:44Z","lastTransitionTime":"2025-10-03T14:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.424962 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.425020 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.425031 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.425045 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.425054 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:44Z","lastTransitionTime":"2025-10-03T14:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.527192 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.527274 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.527297 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.527327 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.527353 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:44Z","lastTransitionTime":"2025-10-03T14:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.629324 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.629369 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.629378 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.629391 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.629399 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:44Z","lastTransitionTime":"2025-10-03T14:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.731752 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.731791 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.731802 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.731817 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.731828 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:44Z","lastTransitionTime":"2025-10-03T14:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.793775 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.793775 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.793872 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 14:01:44 crc kubenswrapper[4636]: E1003 14:01:44.793977 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.794054 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7"
Oct 03 14:01:44 crc kubenswrapper[4636]: E1003 14:01:44.794230 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 03 14:01:44 crc kubenswrapper[4636]: E1003 14:01:44.794384 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd"
Oct 03 14:01:44 crc kubenswrapper[4636]: E1003 14:01:44.794450 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.834167 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.834204 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.834213 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.834225 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.834234 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:44Z","lastTransitionTime":"2025-10-03T14:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.936133 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.936166 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.936178 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.936194 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:44 crc kubenswrapper[4636]: I1003 14:01:44.936204 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:44Z","lastTransitionTime":"2025-10-03T14:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.038720 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.038787 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.038800 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.038812 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.038821 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:45Z","lastTransitionTime":"2025-10-03T14:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.081565 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.090307 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.095347 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:45Z is after 2025-08-24T17:21:41Z"
Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.106683 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:45Z is after 2025-08-24T17:21:41Z"
Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.115418 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:45Z is after 2025-08-24T17:21:41Z"
Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.124383 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:45Z is after 2025-08-24T17:21:41Z"
Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.136884 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f185765f3ef723b58932469681a3cb9249e44073b26c30748a9716a1b7dd51d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:45Z is after 2025-08-24T17:21:41Z"
Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.140513 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.140546 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.140557 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.140573 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.140584 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:45Z","lastTransitionTime":"2025-10-03T14:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.147567 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb259a3c7fa12a75c4729979def8392191091bb90da1bce6bb7112e0a5995392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b3819c73c2b8e54026a6a13ab75376992530ca159f858bdafa1183a699beb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:45Z is after 2025-08-24T17:21:41Z"
Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.160493 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:45Z is after 2025-08-24T17:21:41Z"
Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.174031 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:45Z is after 2025-08-24T17:21:41Z"
Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.198936 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d36197af6590868a0e89840e2e6e0fa753066ca239bc3bb1e2474ba3d1f228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cdbb9faa17368d9a0a2c16db6078c888138ee5270a30baf9a7ca1a9a617cd2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:30Z\\\",\\\"message\\\":\\\"rnetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 14:01:30.490731 6029 services_controller.go:445] Built service openshift-cluster-version/cluster-version-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nI1003 14:01:30.491712 6029 services_controller.go:451] Built service openshift-cluster-version/cluster-version-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-cluster-version/cluster-version-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.182\\\\\\\", Port:9099, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d36197af6590868a0e89840e2e6e0fa753066ca239bc3bb1e2474ba3d1f228\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:42Z\\\",\\\"message\\\":\\\"kube-controller-manager-crc openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-t7xd5 openshift-multus/multus-additional-cni-plugins-lbt25 openshift-image-registry/node-ca-xf7xs openshift-multus/network-metrics-daemon-vm9z7 openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-dns/node-resolver-r9xm2]\\\\nI1003 14:01:42.006517 6219 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 14:01:42.006530 6219 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006537 6219 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006543 6219 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-r9xm2 in node crc\\\\nI1003 14:01:42.006547 6219 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-r9xm2 after 0 failed attempt(s)\\\\nI1003 14:01:42.006552 6219 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006565 6219 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:01:42.006623 6219 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:45Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.209655 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm9z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:45Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.223912 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:45Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.236369 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:45Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.242904 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.242940 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.242948 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.242962 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.242974 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:45Z","lastTransitionTime":"2025-10-03T14:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.249266 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:45Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.260967 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:45Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.272757 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:45Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.284162 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:45Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.345700 4636 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.345733 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.345741 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.345755 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.345765 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:45Z","lastTransitionTime":"2025-10-03T14:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.450012 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.450049 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.450060 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.450075 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.450086 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:45Z","lastTransitionTime":"2025-10-03T14:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.552994 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.553035 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.553046 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.553064 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.553075 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:45Z","lastTransitionTime":"2025-10-03T14:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.655161 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.655190 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.655198 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.655210 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.655219 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:45Z","lastTransitionTime":"2025-10-03T14:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.757759 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.757795 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.757803 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.757817 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.757825 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:45Z","lastTransitionTime":"2025-10-03T14:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.859824 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.859868 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.859878 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.859892 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.859901 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:45Z","lastTransitionTime":"2025-10-03T14:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.962531 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.962567 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.962575 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.962588 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:45 crc kubenswrapper[4636]: I1003 14:01:45.962597 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:45Z","lastTransitionTime":"2025-10-03T14:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.064214 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.064250 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.064259 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.064272 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.064281 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:46Z","lastTransitionTime":"2025-10-03T14:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.166314 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.166350 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.166358 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.166372 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.166382 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:46Z","lastTransitionTime":"2025-10-03T14:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.268796 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.268839 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.268850 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.268864 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.268873 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:46Z","lastTransitionTime":"2025-10-03T14:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.370708 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.370737 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.370746 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.370757 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.370766 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:46Z","lastTransitionTime":"2025-10-03T14:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.473147 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.473190 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.473201 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.473215 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.473226 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:46Z","lastTransitionTime":"2025-10-03T14:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.575381 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.575419 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.575432 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.575447 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.575457 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:46Z","lastTransitionTime":"2025-10-03T14:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.677608 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.677644 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.677651 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.677664 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.677673 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:46Z","lastTransitionTime":"2025-10-03T14:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.779979 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.780023 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.780034 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.780051 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.780063 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:46Z","lastTransitionTime":"2025-10-03T14:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.793560 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.793637 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:46 crc kubenswrapper[4636]: E1003 14:01:46.793660 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:01:46 crc kubenswrapper[4636]: E1003 14:01:46.793773 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.793841 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:46 crc kubenswrapper[4636]: E1003 14:01:46.793900 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.794002 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:46 crc kubenswrapper[4636]: E1003 14:01:46.794061 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
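Annotation: every "Failed to update status for pod" record above fails the same way. The kubelet's status PATCH is intercepted by the pod.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743, and the TLS handshake is rejected because the webhook's serving certificate expired at 2025-08-24T17:21:41Z, well before the node's current clock of 2025-10-03T14:01:45Z. The Go sketch below (the certificate file path is illustrative, not taken from this host) reproduces the NotBefore/NotAfter validity-window check that crypto/x509 applies during chain verification and that produces the exact "certificate has expired or is not yet valid" message seen here; consistent with the log, no status patch can succeed until that certificate is rotated.

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	// Illustrative path only; point this at the webhook's serving certificate.
	pemBytes, err := os.ReadFile("webhook-serving-cert.pem")
	if err != nil {
		fmt.Fprintln(os.Stderr, "read:", err)
		os.Exit(1)
	}
	block, _ := pem.Decode(pemBytes)
	if block == nil {
		fmt.Fprintln(os.Stderr, "no PEM block found")
		os.Exit(1)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		fmt.Fprintln(os.Stderr, "parse:", err)
		os.Exit(1)
	}
	// crypto/x509 rejects any chain whose certificate is outside
	// [NotBefore, NotAfter]; the log's failure corresponds to now
	// (2025-10-03T14:01:45Z) being after NotAfter (2025-08-24T17:21:41Z).
	now := time.Now().UTC()
	if now.Before(cert.NotBefore) || now.After(cert.NotAfter) {
		fmt.Printf("certificate has expired or is not yet valid: valid from %s to %s, current time %s\n",
			cert.NotBefore.Format(time.RFC3339),
			cert.NotAfter.Format(time.RFC3339),
			now.Format(time.RFC3339))
		os.Exit(1)
	}
	fmt.Println("certificate is within its validity window")
}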
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.882308 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.882347 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.882358 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.882376 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.882388 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:46Z","lastTransitionTime":"2025-10-03T14:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.984199 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.984233 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.984243 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.984257 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:46 crc kubenswrapper[4636]: I1003 14:01:46.984267 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:46Z","lastTransitionTime":"2025-10-03T14:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.086901 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.086939 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.086946 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.086961 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.086969 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:47Z","lastTransitionTime":"2025-10-03T14:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.189430 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.189458 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.189466 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.189478 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.189487 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:47Z","lastTransitionTime":"2025-10-03T14:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.291786 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.291818 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.291825 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.291838 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.291846 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:47Z","lastTransitionTime":"2025-10-03T14:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.394256 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.394290 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.394298 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.394312 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.394320 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:47Z","lastTransitionTime":"2025-10-03T14:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.496350 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.496385 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.496395 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.496409 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.496417 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:47Z","lastTransitionTime":"2025-10-03T14:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.598473 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.598506 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.598515 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.598528 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.598537 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:47Z","lastTransitionTime":"2025-10-03T14:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.700382 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.700414 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.700422 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.700434 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.700442 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:47Z","lastTransitionTime":"2025-10-03T14:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.803237 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.803266 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.803275 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.803287 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.803296 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:47Z","lastTransitionTime":"2025-10-03T14:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.905728 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.905762 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.905773 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.905789 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:47 crc kubenswrapper[4636]: I1003 14:01:47.905800 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:47Z","lastTransitionTime":"2025-10-03T14:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.007699 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.007735 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.007752 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.007768 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.007780 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:48Z","lastTransitionTime":"2025-10-03T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.110317 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.110353 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.110364 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.110378 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.110389 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:48Z","lastTransitionTime":"2025-10-03T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.212497 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.212543 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.212552 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.212567 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.212578 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:48Z","lastTransitionTime":"2025-10-03T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.233079 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.233136 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.233148 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.233166 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.233181 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:48Z","lastTransitionTime":"2025-10-03T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:48 crc kubenswrapper[4636]: E1003 14:01:48.244364 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:48Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.248991 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.249031 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.249041 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.249058 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.249070 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:48Z","lastTransitionTime":"2025-10-03T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:48 crc kubenswrapper[4636]: E1003 14:01:48.260934 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:48Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.264449 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.264481 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.264489 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.264503 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.264512 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:48Z","lastTransitionTime":"2025-10-03T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:48 crc kubenswrapper[4636]: E1003 14:01:48.277010 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:48Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.280838 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.280873 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.280881 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.280896 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.280906 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:48Z","lastTransitionTime":"2025-10-03T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:48 crc kubenswrapper[4636]: E1003 14:01:48.296290 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:48Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.299513 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.299538 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.299548 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.299563 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.299573 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:48Z","lastTransitionTime":"2025-10-03T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:48 crc kubenswrapper[4636]: E1003 14:01:48.310332 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:48Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:48 crc kubenswrapper[4636]: E1003 14:01:48.310477 4636 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.314742 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.314766 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.314776 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.314791 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.314801 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:48Z","lastTransitionTime":"2025-10-03T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.416925 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.416964 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.416973 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.416986 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.416997 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:48Z","lastTransitionTime":"2025-10-03T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.519031 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.519080 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.519115 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.519138 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.519153 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:48Z","lastTransitionTime":"2025-10-03T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.621324 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.621359 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.621370 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.621384 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.621392 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:48Z","lastTransitionTime":"2025-10-03T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.723000 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.723040 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.723053 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.723076 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.723115 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:48Z","lastTransitionTime":"2025-10-03T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.793700 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.793725 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.793712 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.793839 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:48 crc kubenswrapper[4636]: E1003 14:01:48.793835 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:01:48 crc kubenswrapper[4636]: E1003 14:01:48.793884 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:01:48 crc kubenswrapper[4636]: E1003 14:01:48.793932 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:01:48 crc kubenswrapper[4636]: E1003 14:01:48.794045 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.825055 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.825086 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.825116 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.825131 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.825140 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:48Z","lastTransitionTime":"2025-10-03T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.927337 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.927370 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.927379 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.927392 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:48 crc kubenswrapper[4636]: I1003 14:01:48.927401 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:48Z","lastTransitionTime":"2025-10-03T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.029360 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.029452 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.029468 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.029484 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.029496 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:49Z","lastTransitionTime":"2025-10-03T14:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.131706 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.131991 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.132307 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.132440 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.132591 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:49Z","lastTransitionTime":"2025-10-03T14:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.234965 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.235028 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.235056 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.235069 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.235079 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:49Z","lastTransitionTime":"2025-10-03T14:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.337637 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.337675 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.337684 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.337696 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.337705 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:49Z","lastTransitionTime":"2025-10-03T14:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.407835 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.427407 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt
\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f185765f3ef723b58932469681a3cb9249e44073b26c30748a9716a1b7dd51d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.438172 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01de5e2b-fdd5-441c-b16a-2a8c23d8520a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae09654ffcd8f59fc1ea875ec17a86d9e421644363bb7b761ee5c32d52760fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4c344be5f6e3ff9611f5000d87254fda72718ca79eda40aa735ba5f1bd95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cf3f403152468f7c3ab024e00ff69e54557a194ac3708c01d993c5f46ff58a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.440925 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.440961 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.440973 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.440987 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.440998 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:49Z","lastTransitionTime":"2025-10-03T14:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.449568 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.461342 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.471851 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.481167 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.492928 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.505686 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.515720 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb259a3c7fa12a75c4729979def8392191091bb90da1bce6bb7112e0a5995392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b3819c73c2b8e54026a6a13ab75376992530ca159f858bdafa1183a699beb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:49Z is after 2025-08-24T17:21:41Z" Oct 03 
14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.527431 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.538072 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.543245 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.543281 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.543290 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.543305 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.543314 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:49Z","lastTransitionTime":"2025-10-03T14:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.555380 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d36197af6590868a0e89840e2e6e0fa753066ca239bc3bb1e2474ba3d1f228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cdbb9faa17368d9a0a2c16db6078c888138ee5270a30baf9a7ca1a9a617cd2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:30Z\\\",\\\"message\\\":\\\"rnetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 14:01:30.490731 6029 services_controller.go:445] Built service openshift-cluster-version/cluster-version-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nI1003 14:01:30.491712 6029 services_controller.go:451] Built service openshift-cluster-version/cluster-version-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-cluster-version/cluster-version-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.182\\\\\\\", Port:9099, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d36197af6590868a0e89840e2e6e0fa753066ca239bc3bb1e2474ba3d1f228\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:42Z\\\",\\\"message\\\":\\\"kube-controller-manager-crc openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-t7xd5 openshift-multus/multus-additional-cni-plugins-lbt25 openshift-image-registry/node-ca-xf7xs openshift-multus/network-metrics-daemon-vm9z7 openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-dns/node-resolver-r9xm2]\\\\nI1003 14:01:42.006517 6219 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 14:01:42.006530 6219 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006537 6219 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006543 6219 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-r9xm2 in node crc\\\\nI1003 14:01:42.006547 6219 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-r9xm2 after 0 failed attempt(s)\\\\nI1003 14:01:42.006552 6219 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006565 6219 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:01:42.006623 6219 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.565323 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm9z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.577045 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.589461 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.602782 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-l
ib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.614179 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:49Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.647889 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.647932 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.647944 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.647960 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.647971 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:49Z","lastTransitionTime":"2025-10-03T14:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.749589 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.749627 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.749643 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.749658 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.749667 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:49Z","lastTransitionTime":"2025-10-03T14:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.852215 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.852273 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.852285 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.852300 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.852312 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:49Z","lastTransitionTime":"2025-10-03T14:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.954993 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.955048 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.955087 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.955117 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:49 crc kubenswrapper[4636]: I1003 14:01:49.955142 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:49Z","lastTransitionTime":"2025-10-03T14:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.057676 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.057713 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.057725 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.057738 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.057750 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:50Z","lastTransitionTime":"2025-10-03T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.159728 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.159769 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.159779 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.159795 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.159807 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:50Z","lastTransitionTime":"2025-10-03T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.261453 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.261497 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.261509 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.261525 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.261535 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:50Z","lastTransitionTime":"2025-10-03T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.363674 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.363713 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.363725 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.363742 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.363751 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:50Z","lastTransitionTime":"2025-10-03T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.465903 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.465941 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.465951 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.465965 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.465977 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:50Z","lastTransitionTime":"2025-10-03T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.567714 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.567755 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.567773 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.567791 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.567801 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:50Z","lastTransitionTime":"2025-10-03T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.669531 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.669568 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.669578 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.669592 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.669600 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:50Z","lastTransitionTime":"2025-10-03T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.775727 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.776050 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.776062 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.776076 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.776090 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:50Z","lastTransitionTime":"2025-10-03T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.792879 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.792879 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.792927 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.792989 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:50 crc kubenswrapper[4636]: E1003 14:01:50.793135 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:01:50 crc kubenswrapper[4636]: E1003 14:01:50.793196 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:01:50 crc kubenswrapper[4636]: E1003 14:01:50.793259 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:01:50 crc kubenswrapper[4636]: E1003 14:01:50.793320 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.806200 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f185765f3ef723b58932469681a3cb9249e44073b26c30748a9716a1b7dd51d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" 
feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:50Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.818323 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01de5e2b-fdd5-441c-b16a-2a8c23d8520a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae09654ffcd8f59fc1ea875ec17a86d9e421644363bb7b761ee5c32d52760fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4c344be5f6e3ff9611f5000d87254fda72718ca79eda40aa735ba5f1bd95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cf3f403152468f7c3ab024e00ff69e54557a194ac3708c01d993c5f46ff58a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:50Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.842412 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:50Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.852702 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:50Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.861559 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:50Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.871808 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:50Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.878450 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.878495 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.878507 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.878523 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.878534 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:50Z","lastTransitionTime":"2025-10-03T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.884811 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:50Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.897698 4636 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:50Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.907296 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb259a3c7fa12a75c4729979def8392191091bb90da1bce6bb7112e0a5995392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b3819c73c2b8e54026a6a13ab75376992530ca159f858bdafa1183a699beb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:50Z is after 2025-08-24T17:21:41Z" Oct 03 
14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.926116 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:50Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.940365 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:50Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.958519 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d36197af6590868a0e89840e2e6e0fa753066c
a239bc3bb1e2474ba3d1f228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cdbb9faa17368d9a0a2c16db6078c888138ee5270a30baf9a7ca1a9a617cd2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:30Z\\\",\\\"message\\\":\\\"rnetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 14:01:30.490731 6029 services_controller.go:445] Built service openshift-cluster-version/cluster-version-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nI1003 14:01:30.491712 6029 services_controller.go:451] Built service openshift-cluster-version/cluster-version-operator cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-cluster-version/cluster-version-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.182\\\\\\\", Port:9099, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d36197af6590868a0e89840e2e6e0fa753066ca239bc3bb1e2474ba3d1f228\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:42Z\\\",\\\"message\\\":\\\"kube-controller-manager-crc openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-t7xd5 openshift-multus/multus-additional-cni-plugins-lbt25 openshift-image-registry/node-ca-xf7xs openshift-multus/network-metrics-daemon-vm9z7 openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-dns/node-resolver-r9xm2]\\\\nI1003 14:01:42.006517 6219 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 14:01:42.006530 6219 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006537 6219 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006543 6219 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-r9xm2 in node crc\\\\nI1003 14:01:42.006547 6219 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-r9xm2 after 0 failed attempt(s)\\\\nI1003 14:01:42.006552 6219 default_network_controller.go:776] 
Recording success event on pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006565 6219 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:01:42.006623 6219 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:50Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.978405 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm9z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:50Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.982630 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.982662 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.982672 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.982686 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.982697 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:50Z","lastTransitionTime":"2025-10-03T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:50 crc kubenswrapper[4636]: I1003 14:01:50.997233 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:50Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.027145 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:51Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.042353 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:51Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.052603 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:51Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.085321 4636 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.085359 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.085369 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.085383 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.085392 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:51Z","lastTransitionTime":"2025-10-03T14:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.188634 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.188671 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.188683 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.188697 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.188706 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:51Z","lastTransitionTime":"2025-10-03T14:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.295948 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.295989 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.296000 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.296020 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.296030 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:51Z","lastTransitionTime":"2025-10-03T14:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.398407 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.398439 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.398456 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.398471 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.398483 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:51Z","lastTransitionTime":"2025-10-03T14:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.500892 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.500950 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.500969 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.500996 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.501014 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:51Z","lastTransitionTime":"2025-10-03T14:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.604461 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.604522 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.604537 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.604559 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.604575 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:51Z","lastTransitionTime":"2025-10-03T14:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.708508 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.708560 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.708568 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.708586 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.708596 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:51Z","lastTransitionTime":"2025-10-03T14:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.812994 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.813054 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.813073 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.813123 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.813142 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:51Z","lastTransitionTime":"2025-10-03T14:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.915294 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.915349 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.915363 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.915379 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:51 crc kubenswrapper[4636]: I1003 14:01:51.915390 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:51Z","lastTransitionTime":"2025-10-03T14:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.017634 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.017688 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.017699 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.017720 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.017735 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:52Z","lastTransitionTime":"2025-10-03T14:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.120195 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.120255 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.120263 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.120278 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.120290 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:52Z","lastTransitionTime":"2025-10-03T14:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.228609 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.228655 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.228668 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.228687 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.228702 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:52Z","lastTransitionTime":"2025-10-03T14:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.331147 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.331204 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.331224 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.331240 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.331251 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:52Z","lastTransitionTime":"2025-10-03T14:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.433949 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.433995 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.434004 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.434020 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.434030 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:52Z","lastTransitionTime":"2025-10-03T14:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.536426 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.536476 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.536487 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.536505 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.536517 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:52Z","lastTransitionTime":"2025-10-03T14:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.638507 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.638779 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.638979 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.639000 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.639013 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:52Z","lastTransitionTime":"2025-10-03T14:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.742515 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.742545 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.742555 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.742569 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.742578 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:52Z","lastTransitionTime":"2025-10-03T14:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.795576 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:52 crc kubenswrapper[4636]: E1003 14:01:52.809561 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.809838 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.809880 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.809894 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:52 crc kubenswrapper[4636]: E1003 14:01:52.810079 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:01:52 crc kubenswrapper[4636]: E1003 14:01:52.810382 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:01:52 crc kubenswrapper[4636]: E1003 14:01:52.810538 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.845538 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.845570 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.845579 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.845594 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.845607 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:52Z","lastTransitionTime":"2025-10-03T14:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.948115 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.948718 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.948795 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.948877 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:52 crc kubenswrapper[4636]: I1003 14:01:52.948943 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:52Z","lastTransitionTime":"2025-10-03T14:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.050959 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.051006 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.051016 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.051033 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.051045 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:53Z","lastTransitionTime":"2025-10-03T14:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.153456 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.153493 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.153504 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.153518 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.153530 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:53Z","lastTransitionTime":"2025-10-03T14:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.256008 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.256491 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.256610 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.256803 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.256922 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:53Z","lastTransitionTime":"2025-10-03T14:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.360013 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.360068 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.360077 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.360111 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.360131 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:53Z","lastTransitionTime":"2025-10-03T14:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.463137 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.463177 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.463189 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.463205 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.463215 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:53Z","lastTransitionTime":"2025-10-03T14:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.566423 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.566774 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.566877 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.566969 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.567048 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:53Z","lastTransitionTime":"2025-10-03T14:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.669052 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.669124 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.669145 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.669666 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.669702 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:53Z","lastTransitionTime":"2025-10-03T14:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.772781 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.772815 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.772823 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.772837 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.772847 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:53Z","lastTransitionTime":"2025-10-03T14:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.877358 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.877416 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.877433 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.877458 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.877473 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:53Z","lastTransitionTime":"2025-10-03T14:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.981175 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.981222 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.981233 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.981255 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:53 crc kubenswrapper[4636]: I1003 14:01:53.981269 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:53Z","lastTransitionTime":"2025-10-03T14:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.085352 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.085401 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.085416 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.085437 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.085452 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:54Z","lastTransitionTime":"2025-10-03T14:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.190005 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.190088 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.190184 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.190221 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.190248 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:54Z","lastTransitionTime":"2025-10-03T14:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.295251 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.295304 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.295315 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.295332 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.295344 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:54Z","lastTransitionTime":"2025-10-03T14:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.399233 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.399309 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.399327 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.399355 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.399377 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:54Z","lastTransitionTime":"2025-10-03T14:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.503007 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.503065 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.503077 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.503119 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.503151 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:54Z","lastTransitionTime":"2025-10-03T14:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.610049 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.610168 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.610197 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.610231 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.610258 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:54Z","lastTransitionTime":"2025-10-03T14:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.713717 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.713808 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.713833 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.713872 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.713900 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:54Z","lastTransitionTime":"2025-10-03T14:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.793243 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.793326 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:54 crc kubenswrapper[4636]: E1003 14:01:54.793397 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.793551 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.793712 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:54 crc kubenswrapper[4636]: E1003 14:01:54.796459 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:01:54 crc kubenswrapper[4636]: E1003 14:01:54.793828 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:01:54 crc kubenswrapper[4636]: E1003 14:01:54.796685 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.816625 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.816676 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.816688 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.816703 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.816716 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:54Z","lastTransitionTime":"2025-10-03T14:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.923511 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.923797 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.923862 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.923943 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:54 crc kubenswrapper[4636]: I1003 14:01:54.924010 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:54Z","lastTransitionTime":"2025-10-03T14:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.027175 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.027216 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.027225 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.027244 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.027253 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:55Z","lastTransitionTime":"2025-10-03T14:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.130871 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.130918 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.130928 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.130943 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.130953 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:55Z","lastTransitionTime":"2025-10-03T14:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.240617 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.240684 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.240701 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.240731 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.240753 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:55Z","lastTransitionTime":"2025-10-03T14:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.344324 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.344707 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.344861 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.345007 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.345280 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:55Z","lastTransitionTime":"2025-10-03T14:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.449326 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.449367 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.449378 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.449395 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.449404 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:55Z","lastTransitionTime":"2025-10-03T14:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.553487 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.553534 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.553543 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.553558 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.553570 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:55Z","lastTransitionTime":"2025-10-03T14:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.655970 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.656547 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.656827 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.657413 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.657618 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:55Z","lastTransitionTime":"2025-10-03T14:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.762061 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.762132 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.762144 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.762161 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.762173 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:55Z","lastTransitionTime":"2025-10-03T14:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.864695 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.864740 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.864751 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.864766 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.864776 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:55Z","lastTransitionTime":"2025-10-03T14:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.967305 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.967352 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.967362 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.967383 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:55 crc kubenswrapper[4636]: I1003 14:01:55.967395 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:55Z","lastTransitionTime":"2025-10-03T14:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.070093 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.070167 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.070183 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.070203 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.070218 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:56Z","lastTransitionTime":"2025-10-03T14:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.173010 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.173071 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.173080 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.173117 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.173131 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:56Z","lastTransitionTime":"2025-10-03T14:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.281346 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.281392 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.281403 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.281417 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.281427 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:56Z","lastTransitionTime":"2025-10-03T14:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.384016 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.384086 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.384121 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.384146 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.384163 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:56Z","lastTransitionTime":"2025-10-03T14:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.486088 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.486314 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.486392 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.486462 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.486523 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:56Z","lastTransitionTime":"2025-10-03T14:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.588508 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.588562 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.588581 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.588606 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.588643 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:56Z","lastTransitionTime":"2025-10-03T14:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.690350 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.690404 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.690413 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.690432 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.690445 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:56Z","lastTransitionTime":"2025-10-03T14:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.792213 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.792239 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.792250 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.792265 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.792273 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:56Z","lastTransitionTime":"2025-10-03T14:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.792840 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.792879 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.792923 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7"
Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.793005 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 14:01:56 crc kubenswrapper[4636]: E1003 14:01:56.793145 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 03 14:01:56 crc kubenswrapper[4636]: E1003 14:01:56.793230 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd"
Oct 03 14:01:56 crc kubenswrapper[4636]: E1003 14:01:56.793399 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 03 14:01:56 crc kubenswrapper[4636]: E1003 14:01:56.793454 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
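Every sync failure in the stretch above has the same recorded cause: the kubelet finds no CNI network config in /etc/kubernetes/cni/net.d/, so it cannot create sandboxes for the four pods listed, and the node's Ready condition stays False. A minimal sketch of that directory check follows; it is illustrative only (the helper name is ours, and the extension list assumes libcni's defaults), not actual kubelet code.

```python
import os

# Directory named in the kubelet errors above.
CNI_CONF_DIR = "/etc/kubernetes/cni/net.d"

def cni_config_present(conf_dir: str = CNI_CONF_DIR) -> bool:
    """Return True if any CNI network config file is present.

    Assumes libcni's default extensions (.conf, .conflist, .json); until one
    such file exists, the runtime keeps reporting NetworkReady=false.
    """
    try:
        return any(
            name.endswith((".conf", ".conflist", ".json"))
            for name in os.listdir(conf_dir)
        )
    except FileNotFoundError:
        return False

if __name__ == "__main__":
    print(f"CNI config present in {CNI_CONF_DIR}: {cni_config_present()}")
```

Once the network provider (here ovn-kubernetes, whose ovnkube-node pod is crash-looping further down) writes its config into that directory, the NetworkReady condition can clear and the pending sandbox creations retry.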
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.894548 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.894580 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.894589 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.894602 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.894610 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:56Z","lastTransitionTime":"2025-10-03T14:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.998537 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.998617 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.998635 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.998661 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:56 crc kubenswrapper[4636]: I1003 14:01:56.998681 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:56Z","lastTransitionTime":"2025-10-03T14:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.100631 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.100669 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.100680 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.100697 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.100709 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:57Z","lastTransitionTime":"2025-10-03T14:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.202809 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.202874 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.202888 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.202909 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.202920 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:57Z","lastTransitionTime":"2025-10-03T14:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.305696 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.305725 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.305735 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.305750 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.305759 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:57Z","lastTransitionTime":"2025-10-03T14:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.407858 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.407895 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.407904 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.407923 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.407933 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:57Z","lastTransitionTime":"2025-10-03T14:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.510874 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.510917 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.510927 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.510943 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.510953 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:57Z","lastTransitionTime":"2025-10-03T14:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.613356 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.613408 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.613421 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.613435 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.613444 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:57Z","lastTransitionTime":"2025-10-03T14:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.715783 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.715875 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.715892 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.715909 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.715919 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:57Z","lastTransitionTime":"2025-10-03T14:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.795785 4636 scope.go:117] "RemoveContainer" containerID="22d36197af6590868a0e89840e2e6e0fa753066ca239bc3bb1e2474ba3d1f228" Oct 03 14:01:57 crc kubenswrapper[4636]: E1003 14:01:57.795961 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-t7xd5_openshift-ovn-kubernetes(564529e3-ff40-4923-9f6d-319a9b41720a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.811771 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:57Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.818048 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.818087 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.818117 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.818136 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.818149 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:57Z","lastTransitionTime":"2025-10-03T14:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.827952 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f185765f3ef723b58932469681a3cb9249e44073b26c30748a9716a1b7dd51d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:57Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.841837 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01de5e2b-fdd5-441c-b16a-2a8c23d8520a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae09654ffcd8f59fc1ea875ec17a86d9e421644363bb7b761ee5c32d52760fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4c344be5f6e3ff9611f5000d87254fda72718ca79eda40aa735ba5f1bd95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cf3f403152468f7c3ab024e00ff69e54557a194ac3708c01d993c5f46ff58a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:57Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.856467 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:57Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.870802 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:57Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.883245 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:57Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.897761 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:57Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.913611 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:57Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.920222 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.920247 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:57 crc 
kubenswrapper[4636]: I1003 14:01:57.920254 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.920269 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.920280 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:57Z","lastTransitionTime":"2025-10-03T14:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.925361 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb259a3c7fa12a75c4729979def8392191091bb90da1bce6bb7112e0a5995392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b3819c73c2b8e54026a6a13ab75376992530ca159f858bdafa1183a699beb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:0
1:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:57Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.940065 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:57Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.953022 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:57Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.969240 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d36197af6590868a0e89840e2e6e0fa753066c
a239bc3bb1e2474ba3d1f228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d36197af6590868a0e89840e2e6e0fa753066ca239bc3bb1e2474ba3d1f228\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:42Z\\\",\\\"message\\\":\\\"kube-controller-manager-crc openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-t7xd5 openshift-multus/multus-additional-cni-plugins-lbt25 openshift-image-registry/node-ca-xf7xs openshift-multus/network-metrics-daemon-vm9z7 openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-dns/node-resolver-r9xm2]\\\\nI1003 14:01:42.006517 6219 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 14:01:42.006530 6219 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006537 6219 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006543 6219 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-r9xm2 in node crc\\\\nI1003 14:01:42.006547 6219 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-r9xm2 after 0 failed attempt(s)\\\\nI1003 14:01:42.006552 6219 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006565 6219 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:01:42.006623 6219 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t7xd5_openshift-ovn-kubernetes(564529e3-ff40-4923-9f6d-319a9b41720a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:57Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.979984 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm9z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:57Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:57 crc kubenswrapper[4636]: I1003 14:01:57.991481 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:57Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.011567 4636 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:58Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.023341 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.023380 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.023390 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.023405 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.023414 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:58Z","lastTransitionTime":"2025-10-03T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.023414 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:58Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.035743 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:58Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.125605 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.125643 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.125652 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.125667 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.125676 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:58Z","lastTransitionTime":"2025-10-03T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.227948 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.227985 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.227993 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.228007 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.228016 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:58Z","lastTransitionTime":"2025-10-03T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.330339 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.330379 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.330394 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.330417 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.330432 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:58Z","lastTransitionTime":"2025-10-03T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.404362 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.404413 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.404455 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.404489 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.404503 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:58Z","lastTransitionTime":"2025-10-03T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:58 crc kubenswrapper[4636]: E1003 14:01:58.417196 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:58Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.420842 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.420875 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.420885 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.420900 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.420912 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:58Z","lastTransitionTime":"2025-10-03T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:58 crc kubenswrapper[4636]: E1003 14:01:58.435075 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:58Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.438071 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.438109 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.438119 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.438136 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.438145 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:58Z","lastTransitionTime":"2025-10-03T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:58 crc kubenswrapper[4636]: E1003 14:01:58.454393 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:58Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.457994 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.458039 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.458050 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.458067 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.458082 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:58Z","lastTransitionTime":"2025-10-03T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:58 crc kubenswrapper[4636]: E1003 14:01:58.471387 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:58Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.475363 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.475413 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.475423 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.475437 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.475447 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:58Z","lastTransitionTime":"2025-10-03T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:58 crc kubenswrapper[4636]: E1003 14:01:58.489650 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:01:58Z is after 2025-08-24T17:21:41Z" Oct 03 14:01:58 crc kubenswrapper[4636]: E1003 14:01:58.489813 4636 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.491583 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.491630 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.491640 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.491657 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.491667 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:58Z","lastTransitionTime":"2025-10-03T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.593998 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.594053 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.594063 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.594078 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.594090 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:58Z","lastTransitionTime":"2025-10-03T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.697164 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.697239 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.697254 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.697275 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.697285 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:58Z","lastTransitionTime":"2025-10-03T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.793467 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.793541 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:58 crc kubenswrapper[4636]: E1003 14:01:58.793623 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:01:58 crc kubenswrapper[4636]: E1003 14:01:58.793720 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.793801 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:01:58 crc kubenswrapper[4636]: E1003 14:01:58.793846 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.793968 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:01:58 crc kubenswrapper[4636]: E1003 14:01:58.794019 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.799959 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.800010 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.800020 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.800033 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.800042 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:58Z","lastTransitionTime":"2025-10-03T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.869700 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs\") pod \"network-metrics-daemon-vm9z7\" (UID: \"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\") " pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:01:58 crc kubenswrapper[4636]: E1003 14:01:58.869953 4636 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:01:58 crc kubenswrapper[4636]: E1003 14:01:58.870070 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs podName:a7f8fb91-fbef-43b5-b771-f376cfbb1cdd nodeName:}" failed. No retries permitted until 2025-10-03 14:02:30.870038519 +0000 UTC m=+100.728764806 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs") pod "network-metrics-daemon-vm9z7" (UID: "a7f8fb91-fbef-43b5-b771-f376cfbb1cdd") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.903309 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.903351 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.903362 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.903377 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:58 crc kubenswrapper[4636]: I1003 14:01:58.903387 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:58Z","lastTransitionTime":"2025-10-03T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.006397 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.006432 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.006443 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.006458 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.006468 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:59Z","lastTransitionTime":"2025-10-03T14:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.108905 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.108949 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.108959 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.108975 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.108987 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:59Z","lastTransitionTime":"2025-10-03T14:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.210960 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.211009 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.211024 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.211044 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.211056 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:59Z","lastTransitionTime":"2025-10-03T14:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.321278 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.321321 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.321329 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.321342 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.321353 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:59Z","lastTransitionTime":"2025-10-03T14:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.423880 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.424019 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.424030 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.424047 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.424056 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:59Z","lastTransitionTime":"2025-10-03T14:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.526739 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.526786 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.526799 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.526816 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.526826 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:59Z","lastTransitionTime":"2025-10-03T14:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.628939 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.628984 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.628995 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.629011 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.629023 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:59Z","lastTransitionTime":"2025-10-03T14:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.731371 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.731410 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.731418 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.731433 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.731444 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:59Z","lastTransitionTime":"2025-10-03T14:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.834233 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.834284 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.834301 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.834319 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.834329 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:59Z","lastTransitionTime":"2025-10-03T14:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.937022 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.937065 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.937073 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.937088 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:01:59 crc kubenswrapper[4636]: I1003 14:01:59.937110 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:01:59Z","lastTransitionTime":"2025-10-03T14:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.039501 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.039805 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.039888 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.039977 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.040087 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:00Z","lastTransitionTime":"2025-10-03T14:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.142873 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.142919 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.142931 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.142952 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.142964 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:00Z","lastTransitionTime":"2025-10-03T14:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.245015 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.245071 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.245089 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.245141 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.245159 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:00Z","lastTransitionTime":"2025-10-03T14:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.347849 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.347890 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.347900 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.347916 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.347926 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:00Z","lastTransitionTime":"2025-10-03T14:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.450229 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.450489 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.450595 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.450745 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.450838 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:00Z","lastTransitionTime":"2025-10-03T14:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.553327 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.553363 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.553377 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.553394 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.553406 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:00Z","lastTransitionTime":"2025-10-03T14:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.656725 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.656780 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.656796 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.656823 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.656840 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:00Z","lastTransitionTime":"2025-10-03T14:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.758919 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.758968 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.758979 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.758997 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.759006 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:00Z","lastTransitionTime":"2025-10-03T14:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.793627 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.793647 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.793673 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.793664 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:02:00 crc kubenswrapper[4636]: E1003 14:02:00.793757 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:02:00 crc kubenswrapper[4636]: E1003 14:02:00.793860 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:02:00 crc kubenswrapper[4636]: E1003 14:02:00.793943 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:02:00 crc kubenswrapper[4636]: E1003 14:02:00.794008 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.809051 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:00Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.819818 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb259a3c7fa12a75c4729979def8392191091bb90da1bce6bb7112e0a5995392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b3819c73c2b8e54026a6a13ab75376992530ca159f858bdafa1183a699beb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:00Z is after 2025-08-24T17:21:41Z" Oct 03 
14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.834750 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:00Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.848059 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:00Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.861559 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.861883 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.862421 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.862550 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.862719 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:00Z","lastTransitionTime":"2025-10-03T14:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.867150 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d36197af6590868a0e89840e2e6e0fa753066ca239bc3bb1e2474ba3d1f228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d36197af6590868a0e89840e2e6e0fa753066ca239bc3bb1e2474ba3d1f228\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:42Z\\\",\\\"message\\\":\\\"kube-controller-manager-crc openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-t7xd5 openshift-multus/multus-additional-cni-plugins-lbt25 openshift-image-registry/node-ca-xf7xs openshift-multus/network-metrics-daemon-vm9z7 openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-dns/node-resolver-r9xm2]\\\\nI1003 14:01:42.006517 6219 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 14:01:42.006530 6219 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006537 6219 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006543 6219 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-r9xm2 in node crc\\\\nI1003 14:01:42.006547 6219 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-r9xm2 after 0 failed attempt(s)\\\\nI1003 14:01:42.006552 6219 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006565 6219 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:01:42.006623 6219 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t7xd5_openshift-ovn-kubernetes(564529e3-ff40-4923-9f6d-319a9b41720a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:00Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.878637 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm9z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:00Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.890365 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:00Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.903225 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:00Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.914793 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:00Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.932963 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:00Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.943521 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:00Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.953401 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01de5e2b-fdd5-441c-b16a-2a8c23d8520a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae09654ffcd8f59fc1ea875ec17a86d9e421644363bb7b761ee5c32d52760fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4c344be5f6e3ff9611f5000d87254fda72718ca79eda40aa735ba5f1bd95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cf3f403152468f7c3ab024e00ff69e54557a194ac3708c01d993c5f46ff58a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-control
ler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:00Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.965549 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:00Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.966245 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.966271 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.966280 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.966296 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.966306 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:00Z","lastTransitionTime":"2025-10-03T14:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.978086 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:00Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.987908 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:00Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:00 crc kubenswrapper[4636]: I1003 14:02:00.998113 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:00Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.012015 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f185765f3ef723b58932469681a3cb9249e44073b26c30748a9716a1b7dd51d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:01Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.068450 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.068508 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.068516 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.068531 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.068539 4636 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:01Z","lastTransitionTime":"2025-10-03T14:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.170380 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.170416 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.170442 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.170459 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.170471 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:01Z","lastTransitionTime":"2025-10-03T14:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.272700 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.272733 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.272744 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.272760 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.272771 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:01Z","lastTransitionTime":"2025-10-03T14:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.375215 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.375249 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.375260 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.375276 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.375286 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:01Z","lastTransitionTime":"2025-10-03T14:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.477756 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.477785 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.477794 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.477808 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.477817 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:01Z","lastTransitionTime":"2025-10-03T14:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.580500 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.580552 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.580565 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.580583 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.580596 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:01Z","lastTransitionTime":"2025-10-03T14:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.682818 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.682877 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.682888 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.682908 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.682921 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:01Z","lastTransitionTime":"2025-10-03T14:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.785472 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.785506 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.785515 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.785529 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.785538 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:01Z","lastTransitionTime":"2025-10-03T14:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.887877 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.887918 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.887932 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.887948 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.887959 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:01Z","lastTransitionTime":"2025-10-03T14:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.993742 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.993774 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.993783 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.993796 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:01 crc kubenswrapper[4636]: I1003 14:02:01.993805 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:01Z","lastTransitionTime":"2025-10-03T14:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.095980 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.096020 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.096030 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.096044 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.096055 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:02Z","lastTransitionTime":"2025-10-03T14:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.197740 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.197775 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.197784 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.197798 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.197808 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:02Z","lastTransitionTime":"2025-10-03T14:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.300553 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.300593 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.300604 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.300624 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.300634 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:02Z","lastTransitionTime":"2025-10-03T14:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.402759 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.402814 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.402822 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.402837 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.402845 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:02Z","lastTransitionTime":"2025-10-03T14:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.504837 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.504885 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.504897 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.504915 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.504928 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:02Z","lastTransitionTime":"2025-10-03T14:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.606908 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.606950 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.606960 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.606975 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.606984 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:02Z","lastTransitionTime":"2025-10-03T14:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.709163 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.709200 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.709211 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.709228 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.709239 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:02Z","lastTransitionTime":"2025-10-03T14:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.793532 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.793593 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.793645 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 14:02:02 crc kubenswrapper[4636]: E1003 14:02:02.793679 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.793631 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:02 crc kubenswrapper[4636]: E1003 14:02:02.793852 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:02 crc kubenswrapper[4636]: E1003 14:02:02.793932 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:02:02 crc kubenswrapper[4636]: E1003 14:02:02.794023 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.811611 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.811677 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.811690 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.811708 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.811719 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:02Z","lastTransitionTime":"2025-10-03T14:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.914035 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.914215 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.914248 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.914268 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:02 crc kubenswrapper[4636]: I1003 14:02:02.914279 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:02Z","lastTransitionTime":"2025-10-03T14:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.016463 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.016492 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.016500 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.016533 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.016542 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:03Z","lastTransitionTime":"2025-10-03T14:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.118913 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.118954 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.118963 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.118978 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.118987 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:03Z","lastTransitionTime":"2025-10-03T14:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.220783 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.220823 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.220834 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.220849 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.220859 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:03Z","lastTransitionTime":"2025-10-03T14:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.322757 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.322794 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.322805 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.322822 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.322834 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:03Z","lastTransitionTime":"2025-10-03T14:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.424853 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.424895 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.424906 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.424921 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.424931 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:03Z","lastTransitionTime":"2025-10-03T14:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.527080 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.527142 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.527156 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.527171 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.527184 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:03Z","lastTransitionTime":"2025-10-03T14:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.629328 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.629387 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.629398 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.629414 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.629425 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:03Z","lastTransitionTime":"2025-10-03T14:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.731967 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.732016 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.732028 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.732048 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.732061 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:03Z","lastTransitionTime":"2025-10-03T14:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.834281 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.834321 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.834332 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.834348 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.834358 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:03Z","lastTransitionTime":"2025-10-03T14:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.936599 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.936637 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.936645 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.936658 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:03 crc kubenswrapper[4636]: I1003 14:02:03.936699 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:03Z","lastTransitionTime":"2025-10-03T14:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.039336 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.039381 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.039392 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.039411 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.039422 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:04Z","lastTransitionTime":"2025-10-03T14:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.141838 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.141871 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.141880 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.141893 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.141904 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:04Z","lastTransitionTime":"2025-10-03T14:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.181311 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ltsq6_140a698f-2661-4dc8-86d9-929b0d6dd326/kube-multus/0.log"
Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.181381 4636 generic.go:334] "Generic (PLEG): container finished" podID="140a698f-2661-4dc8-86d9-929b0d6dd326" containerID="45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57" exitCode=1
Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.181410 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ltsq6" event={"ID":"140a698f-2661-4dc8-86d9-929b0d6dd326","Type":"ContainerDied","Data":"45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57"}
Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.181808 4636 scope.go:117] "RemoveContainer" containerID="45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57"
Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.201460 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:04Z is after 2025-08-24T17:21:41Z"
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:04Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.214637 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:04Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.230271 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f185765f3ef723b58932469681a3cb9249e44073b26c30748a9716a1b7dd51d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:04Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.244534 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.244751 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.244827 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.244906 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.244976 4636 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:04Z","lastTransitionTime":"2025-10-03T14:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.245510 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01de5e2b-fdd5-441c-b16a-2a8c23d8520a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae09654ffcd8f59fc1ea875ec17a86d9e421644363bb7b761ee5c32d52760fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4c344be5f6e3ff9611f5000d87254fda72718ca79eda40aa735ba5f1bd95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cf3f403152468f7c3ab024e00ff69e54557a194ac3708c01d993c5f46ff58a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:04Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.257336 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:04Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.271587 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:04Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.284886 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:04Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.298755 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:04Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.313061 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb259a3c7fa12a75c4729979def8392191091bb90da1bce6bb7112e0a5995392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b3819c73c2b8e54026a6a13ab75376992530ca159f858bdafa1183a699beb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:04Z is after 2025-08-24T17:21:41Z" Oct 03 
14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.325779 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:04Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.338924 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:04Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.347378 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.347411 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.347419 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.347433 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.347442 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:04Z","lastTransitionTime":"2025-10-03T14:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.358599 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d36197af6590868a0e89840e2e6e0fa753066ca239bc3bb1e2474ba3d1f228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d36197af6590868a0e89840e2e6e0fa753066ca239bc3bb1e2474ba3d1f228\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:42Z\\\",\\\"message\\\":\\\"kube-controller-manager-crc openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-t7xd5 openshift-multus/multus-additional-cni-plugins-lbt25 openshift-image-registry/node-ca-xf7xs openshift-multus/network-metrics-daemon-vm9z7 openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-dns/node-resolver-r9xm2]\\\\nI1003 14:01:42.006517 6219 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 14:01:42.006530 6219 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006537 6219 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006543 6219 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-r9xm2 in node crc\\\\nI1003 14:01:42.006547 6219 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-r9xm2 after 0 failed attempt(s)\\\\nI1003 14:01:42.006552 6219 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006565 6219 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:01:42.006623 6219 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t7xd5_openshift-ovn-kubernetes(564529e3-ff40-4923-9f6d-319a9b41720a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:04Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.369605 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm9z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:04Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.382731 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:02:03Z\\\",\\\"message\\\":\\\"2025-10-03T14:01:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6e5679ce-597f-479f-bebe-cabc4f1dd8f1\\\\n2025-10-03T14:01:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6e5679ce-597f-479f-bebe-cabc4f1dd8f1 to /host/opt/cni/bin/\\\\n2025-10-03T14:01:18Z [verbose] multus-daemon started\\\\n2025-10-03T14:01:18Z [verbose] Readiness Indicator file check\\\\n2025-10-03T14:02:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:04Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.395564 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:04Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.408575 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:04Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.423670 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:04Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.450524 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.450569 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.450581 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.450597 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.450607 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:04Z","lastTransitionTime":"2025-10-03T14:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.552569 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.552607 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.552618 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.552634 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.552645 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:04Z","lastTransitionTime":"2025-10-03T14:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.657824 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.658360 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.658499 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.658595 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.658681 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:04Z","lastTransitionTime":"2025-10-03T14:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.760756 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.761050 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.761155 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.761235 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.761298 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:04Z","lastTransitionTime":"2025-10-03T14:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.793719 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.793829 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.793722 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:04 crc kubenswrapper[4636]: E1003 14:02:04.793862 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:02:04 crc kubenswrapper[4636]: E1003 14:02:04.793900 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.793924 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:02:04 crc kubenswrapper[4636]: E1003 14:02:04.793966 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:02:04 crc kubenswrapper[4636]: E1003 14:02:04.794023 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.863194 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.863239 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.863250 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.863264 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.863274 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:04Z","lastTransitionTime":"2025-10-03T14:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.965723 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.966300 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.966422 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.966512 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:04 crc kubenswrapper[4636]: I1003 14:02:04.966595 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:04Z","lastTransitionTime":"2025-10-03T14:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.070589 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.070735 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.070764 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.070796 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.070818 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:05Z","lastTransitionTime":"2025-10-03T14:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.173588 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.173616 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.173624 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.173638 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.173647 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:05Z","lastTransitionTime":"2025-10-03T14:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.184889 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ltsq6_140a698f-2661-4dc8-86d9-929b0d6dd326/kube-multus/0.log" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.184930 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ltsq6" event={"ID":"140a698f-2661-4dc8-86d9-929b0d6dd326","Type":"ContainerStarted","Data":"f09d19aad0b3dd34eb48df35bc872b186fab30f7d6dc9fae25b3fa3b5b2c1d85"} Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.199650 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"
imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:05Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.212022 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:05Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.233222 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d36197af6590868a0e89840e2e6e0fa753066c
a239bc3bb1e2474ba3d1f228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d36197af6590868a0e89840e2e6e0fa753066ca239bc3bb1e2474ba3d1f228\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:42Z\\\",\\\"message\\\":\\\"kube-controller-manager-crc openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-t7xd5 openshift-multus/multus-additional-cni-plugins-lbt25 openshift-image-registry/node-ca-xf7xs openshift-multus/network-metrics-daemon-vm9z7 openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-dns/node-resolver-r9xm2]\\\\nI1003 14:01:42.006517 6219 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 14:01:42.006530 6219 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006537 6219 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006543 6219 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-r9xm2 in node crc\\\\nI1003 14:01:42.006547 6219 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-r9xm2 after 0 failed attempt(s)\\\\nI1003 14:01:42.006552 6219 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006565 6219 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:01:42.006623 6219 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t7xd5_openshift-ovn-kubernetes(564529e3-ff40-4923-9f6d-319a9b41720a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:05Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.243716 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm9z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:05Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.259074 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:05Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.270601 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:05Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.275851 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.275890 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.275902 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.275923 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.275937 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:05Z","lastTransitionTime":"2025-10-03T14:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.288835 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f09d19aad0b3dd34eb48df35bc872b186fab30f7d6dc9fae25b3fa3b5b2c1d85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:02:03Z\\\",\\\"message\\\":\\\"2025-10-03T14:01:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6e5679ce-597f-479f-bebe-cabc4f1dd8f1\\\\n2025-10-03T14:01:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6e5679ce-597f-479f-bebe-cabc4f1dd8f1 to /host/opt/cni/bin/\\\\n2025-10-03T14:01:18Z [verbose] multus-daemon started\\\\n2025-10-03T14:01:18Z [verbose] Readiness Indicator file check\\\\n2025-10-03T14:02:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:05Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.301726 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:05Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.317864 4636 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f185765f3ef723b58932469681a3cb9249e44073b2
6c30748a9716a1b7dd51d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:05Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.333441 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01de5e2b-fdd5-441c-b16a-2a8c23d8520a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae09654ffcd8f59fc1ea875ec17a86d9e421644363bb7b761ee5c32d52760fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4c344be5f6e3ff9611f5000d87254fda72718ca79eda40aa735ba5f1bd95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cf3f403152468f7c3ab024e00ff69e54557a194ac3708c01d993c5f46ff58a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:05Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.346239 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:05Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.361142 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:05Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.373069 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:05Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.378121 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.378154 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.378173 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.378192 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.378205 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:05Z","lastTransitionTime":"2025-10-03T14:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.386523 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:05Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.399036 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:05Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.411550 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:05Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.423176 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb259a3c7fa12a75c4729979def8392191091bb90da1bce6bb7112e0a5995392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b3819c73c2b8e54026a6a13ab75376992530ca159f858bdafa1183a699beb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:05Z is after 2025-08-24T17:21:41Z" Oct 03 
14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.481005 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.481032 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.481041 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.481057 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.481067 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:05Z","lastTransitionTime":"2025-10-03T14:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.582912 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.582949 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.582962 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.582978 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.582988 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:05Z","lastTransitionTime":"2025-10-03T14:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.685456 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.685495 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.685506 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.685522 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.685531 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:05Z","lastTransitionTime":"2025-10-03T14:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.788192 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.788229 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.788239 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.788255 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.788265 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:05Z","lastTransitionTime":"2025-10-03T14:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.889710 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.889755 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.889767 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.889783 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.889795 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:05Z","lastTransitionTime":"2025-10-03T14:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.992046 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.992121 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.992133 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.992152 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:05 crc kubenswrapper[4636]: I1003 14:02:05.992165 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:05Z","lastTransitionTime":"2025-10-03T14:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.094236 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.094279 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.094307 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.094324 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.094333 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:06Z","lastTransitionTime":"2025-10-03T14:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.196533 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.196582 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.196600 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.196619 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.196632 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:06Z","lastTransitionTime":"2025-10-03T14:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.299254 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.299285 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.299293 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.299306 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.299316 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:06Z","lastTransitionTime":"2025-10-03T14:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.401960 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.401994 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.402004 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.402019 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.402029 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:06Z","lastTransitionTime":"2025-10-03T14:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.504460 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.504497 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.504506 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.504522 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.504532 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:06Z","lastTransitionTime":"2025-10-03T14:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.606447 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.606483 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.606491 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.606504 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.606514 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:06Z","lastTransitionTime":"2025-10-03T14:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.709033 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.709065 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.709077 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.709110 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.709122 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:06Z","lastTransitionTime":"2025-10-03T14:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.793983 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.794022 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.794078 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7"
Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.794145 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 14:02:06 crc kubenswrapper[4636]: E1003 14:02:06.794153 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 03 14:02:06 crc kubenswrapper[4636]: E1003 14:02:06.794233 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 03 14:02:06 crc kubenswrapper[4636]: E1003 14:02:06.794303 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 03 14:02:06 crc kubenswrapper[4636]: E1003 14:02:06.794344 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd"
Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.811193 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.811242 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.811253 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.811270 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.811281 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:06Z","lastTransitionTime":"2025-10-03T14:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.913911 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.913956 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.913968 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.913986 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:06 crc kubenswrapper[4636]: I1003 14:02:06.913999 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:06Z","lastTransitionTime":"2025-10-03T14:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
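The four "Error syncing pod" entries above all fail for the same reason: the kubelet finds no CNI configuration under /etc/kubernetes/cni/net.d/, so no pod sandbox can be given a network. A minimal sketch for checking that directory from the node (the path comes from the log messages themselves; the .conf/.conflist/.json extensions follow the usual CNI loader conventions and are an assumption here, not something the log states):

    import os

    # Directory the kubelet names in the NetworkPluginNotReady messages above.
    CNI_CONF_DIR = "/etc/kubernetes/cni/net.d"

    def list_cni_configs(conf_dir=CNI_CONF_DIR):
        """Return CNI config files (.conf, .conflist, .json) found in conf_dir."""
        try:
            entries = sorted(os.listdir(conf_dir))
        except FileNotFoundError:
            return []  # a missing directory reads the same as an empty one
        return [e for e in entries if e.endswith((".conf", ".conflist", ".json"))]

    if __name__ == "__main__":
        configs = list_cni_configs()
        if configs:
            print("CNI configs present:", ", ".join(configs))
        else:
            print("no CNI configuration file in", CNI_CONF_DIR,
                  "-- matches the kubelet's NetworkPluginNotReady condition")

An empty result while the network operator is still coming up is exactly the state these entries describe; the kubelet will keep the node NotReady until a config file appears there.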
Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.016147 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.016182 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.016194 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.016210 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.016221 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:07Z","lastTransitionTime":"2025-10-03T14:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.118695 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.118724 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.118732 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.118745 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.118753 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:07Z","lastTransitionTime":"2025-10-03T14:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.221481 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.221526 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.221537 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.221556 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.221567 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:07Z","lastTransitionTime":"2025-10-03T14:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.325401 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.325675 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.325756 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.325836 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.325900 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:07Z","lastTransitionTime":"2025-10-03T14:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.428221 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.428275 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.428289 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.428304 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.428314 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:07Z","lastTransitionTime":"2025-10-03T14:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.530915 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.531228 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.531352 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.531458 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.531565 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:07Z","lastTransitionTime":"2025-10-03T14:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.633656 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.633724 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.633742 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.633777 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.633794 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:07Z","lastTransitionTime":"2025-10-03T14:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.736271 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.736311 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.736322 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.736366 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.736381 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:07Z","lastTransitionTime":"2025-10-03T14:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.839321 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.839372 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.839390 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.839416 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.839435 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:07Z","lastTransitionTime":"2025-10-03T14:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.942132 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.942244 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.942258 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.942273 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:07 crc kubenswrapper[4636]: I1003 14:02:07.942284 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:07Z","lastTransitionTime":"2025-10-03T14:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.045575 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.045613 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.045624 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.045639 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.045649 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:08Z","lastTransitionTime":"2025-10-03T14:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.148913 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.148959 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.148970 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.148988 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.149003 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:08Z","lastTransitionTime":"2025-10-03T14:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.252600 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.252665 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.252687 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.252718 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.252743 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:08Z","lastTransitionTime":"2025-10-03T14:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.356212 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.356249 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.356262 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.356352 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.356365 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:08Z","lastTransitionTime":"2025-10-03T14:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.459595 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.459658 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.459681 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.459710 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.459734 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:08Z","lastTransitionTime":"2025-10-03T14:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.562756 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.562806 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.562825 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.562849 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.562866 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:08Z","lastTransitionTime":"2025-10-03T14:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.665763 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.666217 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.666398 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.666590 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.666719 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:08Z","lastTransitionTime":"2025-10-03T14:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.769924 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.769974 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.769985 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.770003 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.770015 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:08Z","lastTransitionTime":"2025-10-03T14:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.793531 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7"
Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.793599 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 14:02:08 crc kubenswrapper[4636]: E1003 14:02:08.793670 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd"
Oct 03 14:02:08 crc kubenswrapper[4636]: E1003 14:02:08.793762 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.793837 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 14:02:08 crc kubenswrapper[4636]: E1003 14:02:08.793911 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.794134 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 14:02:08 crc kubenswrapper[4636]: E1003 14:02:08.794246 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
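The same four pods hit the same sandbox/sync failure on every retry, roughly every two seconds. A short parsing sketch (a hypothetical helper, assuming the journal text is piped in on stdin in the klog format shown here) that tallies the failures per pod, which makes the retry loop easy to see in a long capture:

    import re
    import sys
    from collections import Counter

    # Matches klog entries of the form seen above, e.g.
    # E1003 14:02:08.793670 4636 pod_workers.go:1301] "Error syncing pod,
    # skipping" err="..." pod="openshift-multus/network-metrics-daemon-vm9z7" ...
    SYNC_ERR = re.compile(r'"Error syncing pod, skipping".*?pod="(?P<pod>[^"]+)"')

    def tally(lines):
        """Count 'Error syncing pod' occurrences per pod across journal lines."""
        counts = Counter()
        for line in lines:
            for match in SYNC_ERR.finditer(line):
                counts[match.group("pod")] += 1
        return counts

    if __name__ == "__main__":
        for pod, n in tally(sys.stdin).most_common():
            print(f"{n:6d}  {pod}")

Fed with journalctl output for the kubelet unit (invocation left to the reader, since the unit name is not shown in this capture), it would report the same four pods failing repeatedly for as long as the CNI configuration is missing.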
Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.849913 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.849969 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.849981 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.850000 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.850012 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:08Z","lastTransitionTime":"2025-10-03T14:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:08 crc kubenswrapper[4636]: E1003 14:02:08.867183 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:08Z is after 2025-08-24T17:21:41Z"
Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.871275 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.871307 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
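The patch failure above surfaces the underlying fault: the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 presents a certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-10-03. A minimal sketch to confirm the validity window from the node (assumes the third-party cryptography package is installed; ssl.get_server_certificate performs no chain verification when no CA bundle is supplied, so the handshake still succeeds against the expired certificate):

    import ssl
    from datetime import datetime, timezone

    from cryptography import x509  # third-party: pip install cryptography

    # Webhook endpoint quoted in the "failed calling webhook" error above.
    HOST, PORT = "127.0.0.1", 9743

    def check_webhook_cert(host=HOST, port=PORT):
        # No CA bundle is passed, so the chain is not verified; we only want
        # to read the certificate's validity window.
        pem = ssl.get_server_certificate((host, port))
        cert = x509.load_pem_x509_certificate(pem.encode())
        not_after = cert.not_valid_after_utc  # cryptography >= 42; older: .not_valid_after
        now = datetime.now(timezone.utc)
        state = "EXPIRED" if now > not_after else "valid"
        print(f"{host}:{port} notAfter={not_after.isoformat()} ({state})")

    if __name__ == "__main__":
        check_webhook_cert()

Against this node it should report notAfter=2025-08-24T17:21:41+00:00 (EXPIRED), matching the x509 error in the entry above; until that certificate is rotated, every node-status patch will fail the same way and the retries recorded below will continue.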
Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.871321 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.871342 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.871352 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:08Z","lastTransitionTime":"2025-10-03T14:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:08 crc kubenswrapper[4636]: E1003 14:02:08.889647 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:08Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.894630 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.894703 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.894723 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.894747 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.894766 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:08Z","lastTransitionTime":"2025-10-03T14:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:08 crc kubenswrapper[4636]: E1003 14:02:08.912925 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:08Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.918141 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.918182 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
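Every retry above dies at the same step: the node-identity webhook at 127.0.0.1:9743 presents a serving certificate whose NotAfter (2025-08-24T17:21:41Z) is behind the node clock (2025-10-03), so the status patch is rejected before it ever reaches the Node object. A minimal Go diagnostic sketch, assuming it is run on the node itself; the address and expiry bound come from the log line above, and the handshake deliberately skips chain verification so the expired certificate can still be inspected:

package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Address taken from the kubelet error; diagnostic only.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // we want to read the cert, not verify the chain
	})
	if err != nil {
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()

	state := conn.ConnectionState()
	if len(state.PeerCertificates) == 0 {
		fmt.Println("no peer certificate presented")
		return
	}
	cert := state.PeerCertificates[0]
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))
	if time.Now().After(cert.NotAfter) {
		fmt.Println("certificate is expired, matching the kubelet x509 error")
	}
}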
event="NodeHasNoDiskPressure" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.918194 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.918217 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.918227 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:08Z","lastTransitionTime":"2025-10-03T14:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:08 crc kubenswrapper[4636]: E1003 14:02:08.935288 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:08Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.939320 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.939381 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
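The patch has now failed repeatedly within a few tens of milliseconds. The kubelet makes a fixed number of attempts per sync (nodeStatusUpdateRetry = 5 in upstream kubelet sources; stated here as an assumption) before giving up with "update node status exceeds retry count", as the next entries show. A minimal sketch of that loop shape; tryPatchStatus is a hypothetical stand-in for the real PATCH call:

package main

import (
	"errors"
	"fmt"
)

const nodeStatusUpdateRetry = 5 // assumption: mirrors the upstream kubelet constant

// tryPatchStatus is hypothetical; here it always fails the way the log shows.
func tryPatchStatus(node string) error {
	return errors.New(`failed calling webhook "node.network-node-identity.openshift.io": x509: certificate has expired or is not yet valid`)
}

func main() {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := tryPatchStatus("crc"); err != nil {
			fmt.Printf("Error updating node status, will retry: %v\n", err)
			continue
		}
		return // a successful patch would end the cycle
	}
	fmt.Println("Unable to update node status: update node status exceeds retry count")
}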
event="NodeHasNoDiskPressure" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.939398 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.939420 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.939435 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:08Z","lastTransitionTime":"2025-10-03T14:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:08 crc kubenswrapper[4636]: E1003 14:02:08.951862 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:08Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:08 crc kubenswrapper[4636]: E1003 14:02:08.951977 4636 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.953543 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.953574 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.953583 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.953599 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:08 crc kubenswrapper[4636]: I1003 14:02:08.953612 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:08Z","lastTransitionTime":"2025-10-03T14:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.056381 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.056421 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.056429 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.056443 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.056452 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:09Z","lastTransitionTime":"2025-10-03T14:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.158944 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.158974 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.158983 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.158996 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.159005 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:09Z","lastTransitionTime":"2025-10-03T14:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.261529 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.261560 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.261568 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.261582 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.261591 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:09Z","lastTransitionTime":"2025-10-03T14:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.363520 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.363550 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.363558 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.363571 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.363581 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:09Z","lastTransitionTime":"2025-10-03T14:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.466478 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.466543 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.466555 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.466593 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.466605 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:09Z","lastTransitionTime":"2025-10-03T14:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.569219 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.569262 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.569274 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.569297 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.569308 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:09Z","lastTransitionTime":"2025-10-03T14:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.671817 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.671875 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.671890 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.671907 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.671918 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:09Z","lastTransitionTime":"2025-10-03T14:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.774469 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.774516 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.774533 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.774564 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.774581 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:09Z","lastTransitionTime":"2025-10-03T14:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.877077 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.877141 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.877152 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.877167 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.877178 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:09Z","lastTransitionTime":"2025-10-03T14:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.979771 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.979819 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.979833 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.979851 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:09 crc kubenswrapper[4636]: I1003 14:02:09.979864 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:09Z","lastTransitionTime":"2025-10-03T14:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.081894 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.081941 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.081951 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.081968 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.081977 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:10Z","lastTransitionTime":"2025-10-03T14:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.184459 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.184507 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.184520 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.184538 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.184549 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:10Z","lastTransitionTime":"2025-10-03T14:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.286673 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.286750 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.286773 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.286803 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.286823 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:10Z","lastTransitionTime":"2025-10-03T14:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.388834 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.388879 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.388895 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.388915 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.388929 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:10Z","lastTransitionTime":"2025-10-03T14:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.388929 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:10Z","lastTransitionTime":"2025-10-03T14:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.493004 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.493042 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.493051 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.493068 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.493078 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:10Z","lastTransitionTime":"2025-10-03T14:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.595758 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.595810 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.595822 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.595840 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.595854 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:10Z","lastTransitionTime":"2025-10-03T14:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.698028 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.698078 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.698093 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.698133 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
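
The "Failed to update status for pod" entries that follow all fail the same way: the kubelet cannot call the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 because its serving certificate expired ("x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:10Z is after 2025-08-24T17:21:41Z"). A hypothetical way to confirm the expiry directly from the node, assuming the third-party cryptography package is available; the endpoint is taken from the failed Post URLs below:

    import socket
    import ssl
    from cryptography import x509  # third-party package; assumed installed

    # Webhook endpoint taken from the failed Post URLs in the entries below.
    HOST, PORT = "127.0.0.1", 9743

    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE  # accept the expired cert so it can be inspected

    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        with ctx.wrap_socket(sock) as tls:
            der = tls.getpeercert(binary_form=True)  # raw DER, returned even with CERT_NONE

    cert = x509.load_der_x509_certificate(der)
    print("subject: ", cert.subject.rfc4514_string())
    print("notAfter:", cert.not_valid_after)  # expect 2025-08-24 17:21:41 per the log
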
Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.698144 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:10Z","lastTransitionTime":"2025-10-03T14:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.793259 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7"
Oct 03 14:02:10 crc kubenswrapper[4636]: E1003 14:02:10.793466 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd"
Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.793789 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.793831 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.793802 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 14:02:10 crc kubenswrapper[4636]: E1003 14:02:10.793913 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 03 14:02:10 crc kubenswrapper[4636]: E1003 14:02:10.794013 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 03 14:02:10 crc kubenswrapper[4636]: E1003 14:02:10.794141 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.799788 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.799818 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.799827 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.799844 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.799856 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:10Z","lastTransitionTime":"2025-10-03T14:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.809030 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f185765f3ef723b58932469681a3cb9249e44073b26c30748a9716a1b7dd51d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:10Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:10 
crc kubenswrapper[4636]: I1003 14:02:10.823116 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01de5e2b-fdd5-441c-b16a-2a8c23d8520a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae09654ffcd8f59fc1ea875ec17a86d9e421644363bb7b761ee5c32d52760fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4c344be5f6e3ff9611f5000d87254fda72718ca79eda40aa735ba5f1bd95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cf3f403152468f7c3ab024e00ff69e54557a194ac3708c01d993c5f46ff58a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\
",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:10Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.835418 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:10Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.846637 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:10Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.855289 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:10Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.864855 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:10Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.883287 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:10Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.900370 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:10Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.901919 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.902243 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:10 crc 
kubenswrapper[4636]: I1003 14:02:10.902403 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.902500 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.902596 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:10Z","lastTransitionTime":"2025-10-03T14:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.911555 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb259a3c7fa12a75c4729979def8392191091bb90da1bce6bb7112e0a5995392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b3819c73c2b8e54026a6a13ab75376992530ca159f858bdafa1183a699beb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:0
1:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:10Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.926005 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:10Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.942532 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:10Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.960319 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d36197af6590868a0e89840e2e6e0fa753066c
a239bc3bb1e2474ba3d1f228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d36197af6590868a0e89840e2e6e0fa753066ca239bc3bb1e2474ba3d1f228\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:42Z\\\",\\\"message\\\":\\\"kube-controller-manager-crc openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-t7xd5 openshift-multus/multus-additional-cni-plugins-lbt25 openshift-image-registry/node-ca-xf7xs openshift-multus/network-metrics-daemon-vm9z7 openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-dns/node-resolver-r9xm2]\\\\nI1003 14:01:42.006517 6219 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 14:01:42.006530 6219 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006537 6219 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006543 6219 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-r9xm2 in node crc\\\\nI1003 14:01:42.006547 6219 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-r9xm2 after 0 failed attempt(s)\\\\nI1003 14:01:42.006552 6219 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006565 6219 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:01:42.006623 6219 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t7xd5_openshift-ovn-kubernetes(564529e3-ff40-4923-9f6d-319a9b41720a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:10Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.971617 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm9z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:10Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.985544 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:10Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:10 crc kubenswrapper[4636]: I1003 14:02:10.999005 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:10Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.005044 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.005082 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.005091 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.005126 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.005137 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:11Z","lastTransitionTime":"2025-10-03T14:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.012616 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f09d19aad0b3dd34eb48df35bc872b186fab30f7d6dc9fae25b3fa3b5b2c1d85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:02:03Z\\\",\\\"message\\\":\\\"2025-10-03T14:01:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6e5679ce-597f-479f-bebe-cabc4f1dd8f1\\\\n2025-10-03T14:01:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6e5679ce-597f-479f-bebe-cabc4f1dd8f1 to /host/opt/cni/bin/\\\\n2025-10-03T14:01:18Z [verbose] multus-daemon started\\\\n2025-10-03T14:01:18Z [verbose] Readiness Indicator file check\\\\n2025-10-03T14:02:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:11Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.023059 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:11Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.075356 4636 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.076244 4636 scope.go:117] "RemoveContainer" containerID="22d36197af6590868a0e89840e2e6e0fa753066ca239bc3bb1e2474ba3d1f228" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.106711 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.106737 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.106745 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.106759 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.106769 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:11Z","lastTransitionTime":"2025-10-03T14:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.205253 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7xd5_564529e3-ff40-4923-9f6d-319a9b41720a/ovnkube-controller/2.log" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.207807 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerStarted","Data":"1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef"} Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.208650 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.211771 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.211795 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.211803 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.211819 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.211828 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:11Z","lastTransitionTime":"2025-10-03T14:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.222282 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:11Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.239225 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:11Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.250371 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb259a3c7fa12a75c4729979def8392191091bb90da1bce6bb7112e0a5995392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b3819c73c2b8e54026a6a13ab75376992530ca159f858bdafa1183a699beb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:11Z is after 2025-08-24T17:21:41Z" Oct 03 
14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.266035 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:11Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.280636 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:11Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.299927 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4
ab38413183ec37d5e0c212ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d36197af6590868a0e89840e2e6e0fa753066ca239bc3bb1e2474ba3d1f228\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:42Z\\\",\\\"message\\\":\\\"kube-controller-manager-crc openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-t7xd5 openshift-multus/multus-additional-cni-plugins-lbt25 openshift-image-registry/node-ca-xf7xs openshift-multus/network-metrics-daemon-vm9z7 openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-dns/node-resolver-r9xm2]\\\\nI1003 14:01:42.006517 6219 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 14:01:42.006530 6219 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006537 6219 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006543 6219 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-r9xm2 in node crc\\\\nI1003 14:01:42.006547 6219 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-r9xm2 after 0 failed attempt(s)\\\\nI1003 14:01:42.006552 6219 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006565 6219 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:01:42.006623 6219 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:11Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.312609 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm9z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:11Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.314412 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.314468 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.314481 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.314500 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.314514 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:11Z","lastTransitionTime":"2025-10-03T14:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.327112 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:11Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.340123 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:11Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.353264 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f09d19aad0b3dd34eb48df35bc872b186fab30f7d6dc9fae25b3fa3b5b2c1d85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:02:03Z\\\",\\\"message\\\":\\\"2025-10-03T14:01:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6e5679ce-597f-479f-bebe-cabc4f1dd8f1\\\\n2025-10-03T14:01:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6e5679ce-597f-479f-bebe-cabc4f1dd8f1 to 
/host/opt/cni/bin/\\\\n2025-10-03T14:01:18Z [verbose] multus-daemon started\\\\n2025-10-03T14:01:18Z [verbose] Readiness Indicator file check\\\\n2025-10-03T14:02:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:11Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.366254 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:11Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.380260 4636 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f185765f3ef723b58932469681a3cb9249e44073b2
6c30748a9716a1b7dd51d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:11Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.392251 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01de5e2b-fdd5-441c-b16a-2a8c23d8520a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae09654ffcd8f59fc1ea875ec17a86d9e421644363bb7b761ee5c32d52760fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4c344be5f6e3ff9611f5000d87254fda72718ca79eda40aa735ba5f1bd95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cf3f403152468f7c3ab024e00ff69e54557a194ac3708c01d993c5f46ff58a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:11Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.408944 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:11Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.416351 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.416423 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.416437 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.416455 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.416465 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:11Z","lastTransitionTime":"2025-10-03T14:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.427487 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:11Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.448735 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:11Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.463740 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:11Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.519220 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.519263 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.519282 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.519299 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.519308 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:11Z","lastTransitionTime":"2025-10-03T14:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.621839 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.621884 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.621893 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.621907 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.621916 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:11Z","lastTransitionTime":"2025-10-03T14:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.724675 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.724716 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.724727 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.724745 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.724758 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:11Z","lastTransitionTime":"2025-10-03T14:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.827040 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.827082 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.827109 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.827131 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.827143 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:11Z","lastTransitionTime":"2025-10-03T14:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.929963 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.930014 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.930024 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.930045 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:11 crc kubenswrapper[4636]: I1003 14:02:11.930055 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:11Z","lastTransitionTime":"2025-10-03T14:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.032961 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.033024 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.033036 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.033050 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.033060 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:12Z","lastTransitionTime":"2025-10-03T14:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.135056 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.135116 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.135126 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.135142 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.135153 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:12Z","lastTransitionTime":"2025-10-03T14:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.213247 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7xd5_564529e3-ff40-4923-9f6d-319a9b41720a/ovnkube-controller/3.log" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.213767 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7xd5_564529e3-ff40-4923-9f6d-319a9b41720a/ovnkube-controller/2.log" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.216658 4636 generic.go:334] "Generic (PLEG): container finished" podID="564529e3-ff40-4923-9f6d-319a9b41720a" containerID="1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef" exitCode=1 Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.216730 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerDied","Data":"1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef"} Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.216821 4636 scope.go:117] "RemoveContainer" containerID="22d36197af6590868a0e89840e2e6e0fa753066ca239bc3bb1e2474ba3d1f228" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.218047 4636 scope.go:117] "RemoveContainer" containerID="1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef" Oct 03 14:02:12 crc kubenswrapper[4636]: E1003 14:02:12.218352 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-t7xd5_openshift-ovn-kubernetes(564529e3-ff40-4923-9f6d-319a9b41720a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.239595 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.239639 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.239650 4636 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.239667 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.239679 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:12Z","lastTransitionTime":"2025-10-03T14:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.245532 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:12Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.276625 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:12Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.294862 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb259a3c7fa12a75c4729979def8392191091bb90da1bce6bb7112e0a5995392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b3819c73c2b8e54026a6a13ab75376992530ca159f858bdafa1183a699beb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:12Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.309476 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controll
er-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:12Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.323358 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:12Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.347767 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.347796 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.347805 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.347818 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.347827 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:12Z","lastTransitionTime":"2025-10-03T14:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.350701 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d36197af6590868a0e89840e2e6e0fa753066ca239bc3bb1e2474ba3d1f228\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:01:42Z\\\",\\\"message\\\":\\\"kube-controller-manager-crc openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-t7xd5 openshift-multus/multus-additional-cni-plugins-lbt25 openshift-image-registry/node-ca-xf7xs openshift-multus/network-metrics-daemon-vm9z7 openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-dns/node-resolver-r9xm2]\\\\nI1003 14:01:42.006517 6219 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 14:01:42.006530 6219 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006537 6219 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006543 6219 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-r9xm2 in node crc\\\\nI1003 14:01:42.006547 6219 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-r9xm2 after 0 failed attempt(s)\\\\nI1003 14:01:42.006552 6219 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-r9xm2\\\\nI1003 14:01:42.006565 6219 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:01:42.006623 6219 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:02:11Z\\\",\\\"message\\\":\\\"obj_retry.go:551] Creating *factory.egressNode crc took: 28.926067ms\\\\nI1003 14:02:11.918695 6573 factory.go:1336] Added *v1.Node event handler 7\\\\nI1003 14:02:11.918762 6573 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1003 14:02:11.919169 6573 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1003 14:02:11.919255 6573 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 14:02:11.919293 6573 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 14:02:11.919384 6573 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 14:02:11.919400 6573 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 14:02:11.919553 6573 factory.go:656] Stopping watch factory\\\\nI1003 14:02:11.919586 6573 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 14:02:11.919280 6573 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1003 14:02:11.919658 6573 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 14:02:11.919672 6573 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 14:02:11.919692 6573 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 14:02:11.924322 6573 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:02:11.924380 6573 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:02:11.924470 6573 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:12Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.363065 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm9z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:12Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.377877 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:12Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.390135 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:12Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.408427 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f09d19aad0b3dd34eb48df35bc872b186fab30f7d6dc9fae25b3fa3b5b2c1d85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:02:03Z\\\",\\\"message\\\":\\\"2025-10-03T14:01:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6e5679ce-597f-479f-bebe-cabc4f1dd8f1\\\\n2025-10-03T14:01:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6e5679ce-597f-479f-bebe-cabc4f1dd8f1 to /host/opt/cni/bin/\\\\n2025-10-03T14:01:18Z [verbose] multus-daemon started\\\\n2025-10-03T14:01:18Z [verbose] Readiness Indicator file check\\\\n2025-10-03T14:02:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:12Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.424557 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:12Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.438199 4636 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f185765f3ef723b58932469681a3cb9249e44073b2
6c30748a9716a1b7dd51d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:12Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.450362 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.450406 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.450418 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.450436 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.450446 4636 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:12Z","lastTransitionTime":"2025-10-03T14:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.456619 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01de5e2b-fdd5-441c-b16a-2a8c23d8520a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae09654ffcd8f59fc1ea875ec17a86d9e421644363bb7b761ee5c32d52760fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4c344be5f6e3ff9611f5000d87254fda72718ca79eda40aa735ba5f1bd95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cf3f403152468f7c3ab024e00ff69e54557a194ac3708c01d993c5f46ff58a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:12Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.473000 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:12Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.484576 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:12Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.496345 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:12Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.506300 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:12Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.553388 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.553420 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.553428 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.553441 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.553451 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:12Z","lastTransitionTime":"2025-10-03T14:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.656452 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.656504 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.656520 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.656545 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.656562 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:12Z","lastTransitionTime":"2025-10-03T14:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.758868 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.758921 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.758932 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.758947 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.758958 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:12Z","lastTransitionTime":"2025-10-03T14:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.793633 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:02:12 crc kubenswrapper[4636]: E1003 14:02:12.793823 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.794214 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:02:12 crc kubenswrapper[4636]: E1003 14:02:12.794318 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.794622 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:02:12 crc kubenswrapper[4636]: E1003 14:02:12.794738 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.794935 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:12 crc kubenswrapper[4636]: E1003 14:02:12.795053 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.862065 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.862133 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.862146 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.862162 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.862174 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:12Z","lastTransitionTime":"2025-10-03T14:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.964147 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.964204 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.964217 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.964239 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:12 crc kubenswrapper[4636]: I1003 14:02:12.964255 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:12Z","lastTransitionTime":"2025-10-03T14:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.067016 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.067046 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.067056 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.067073 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.067085 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:13Z","lastTransitionTime":"2025-10-03T14:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.170416 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.170489 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.170503 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.170527 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.170542 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:13Z","lastTransitionTime":"2025-10-03T14:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.223588 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7xd5_564529e3-ff40-4923-9f6d-319a9b41720a/ovnkube-controller/3.log" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.228967 4636 scope.go:117] "RemoveContainer" containerID="1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef" Oct 03 14:02:13 crc kubenswrapper[4636]: E1003 14:02:13.229192 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-t7xd5_openshift-ovn-kubernetes(564529e3-ff40-4923-9f6d-319a9b41720a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.246020 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:13Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.264520 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:13Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.276925 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.276975 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.276984 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.277003 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.277015 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:13Z","lastTransitionTime":"2025-10-03T14:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.280821 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:13Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.303002 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f185765f3ef723b58932469681a3cb9249e44073b26c30748a9716a1b7dd51d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:13Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.318510 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01de5e2b-fdd5-441c-b16a-2a8c23d8520a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae09654ffcd8f59fc1ea875ec17a86d9e421644363bb7b761ee5c32d52760fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4c344be5f6e3ff9611f5000d87254fda72718ca79eda40aa735ba5f1bd95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cf3f403152468f7c3ab024e00ff69e54557a194ac3708c01d993c5f46ff58a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:13Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.336810 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:13Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.353714 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:13Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.369511 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f64
47f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:13Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.379537 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.379568 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.379578 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.379602 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.379619 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:13Z","lastTransitionTime":"2025-10-03T14:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.381786 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb259a3c7fa12a75c4729979def8392191091bb90da1bce6bb7112e0a5995392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b3819c73c2b8e54026a6a13ab75376992530ca159f858bdafa1183a699beb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:13Z is after 2025-08-24T17:21:41Z" Oct 03 
14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.392774 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm9z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:13Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.404768 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:13Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.417776 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:13Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.438398 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4
ab38413183ec37d5e0c212ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:02:11Z\\\",\\\"message\\\":\\\"obj_retry.go:551] Creating *factory.egressNode crc took: 28.926067ms\\\\nI1003 14:02:11.918695 6573 factory.go:1336] Added *v1.Node event handler 7\\\\nI1003 14:02:11.918762 6573 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1003 14:02:11.919169 6573 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1003 14:02:11.919255 6573 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 14:02:11.919293 6573 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 14:02:11.919384 6573 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 14:02:11.919400 6573 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 14:02:11.919553 6573 factory.go:656] Stopping watch factory\\\\nI1003 14:02:11.919586 6573 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 14:02:11.919280 6573 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1003 14:02:11.919658 6573 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 14:02:11.919672 6573 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 14:02:11.919692 6573 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 14:02:11.924322 6573 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:02:11.924380 6573 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:02:11.924470 6573 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:02:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t7xd5_openshift-ovn-kubernetes(564529e3-ff40-4923-9f6d-319a9b41720a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:13Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.452343 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
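
The repeated webhook failure above is self-diagnosing: every status patch dies because the network-node-identity webhook at https://127.0.0.1:9743 serves a certificate whose notAfter (2025-08-24T17:21:41Z) is weeks behind the node clock (2025-10-03T14:02:13Z). A minimal probe sketch, not part of the log, that reads that certificate's validity window from the node; it assumes Python with the third-party cryptography package (>= 42 for the *_utc accessors) and takes host/port from the failing URL:

import socket
import ssl

from cryptography import x509

HOST, PORT = "127.0.0.1", 9743  # from: Post "https://127.0.0.1:9743/pod?timeout=10s"

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE  # inspect the cert even though it is expired

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)  # raw DER; returned even with CERT_NONE

cert = x509.load_der_x509_certificate(der)
print("notBefore:", cert.not_valid_before_utc)
print("notAfter: ", cert.not_valid_after_utc)  # expect 2025-08-24T17:21:41Z per the log
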
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:13Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.465684 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f09d19aad0b3dd34eb48df35bc872b186fab30f7d6dc9fae25b3fa3b5b2c1d85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:02:03Z\\\",\\\"message\\\":\\\"2025-10-03T14:01:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6e5679ce-597f-479f-bebe-cabc4f1dd8f1\\\\n2025-10-03T14:01:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6e5679ce-597f-479f-bebe-cabc4f1dd8f1 to /host/opt/cni/bin/\\\\n2025-10-03T14:01:18Z [verbose] multus-daemon started\\\\n2025-10-03T14:01:18Z [verbose] Readiness Indicator file check\\\\n2025-10-03T14:02:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:13Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.477528 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:13Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.481176 4636 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.481213 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.481225 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.481242 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.481255 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:13Z","lastTransitionTime":"2025-10-03T14:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.492087 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:13Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.583928 4636 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.583961 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.583972 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.583988 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.583999 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:13Z","lastTransitionTime":"2025-10-03T14:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.687070 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.687147 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.687160 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.687201 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.687222 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:13Z","lastTransitionTime":"2025-10-03T14:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.790204 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.790260 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.790278 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.790303 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.790319 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:13Z","lastTransitionTime":"2025-10-03T14:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.893987 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.894054 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.894080 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.894153 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.894181 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:13Z","lastTransitionTime":"2025-10-03T14:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.997164 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.997209 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.997221 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.997238 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:13 crc kubenswrapper[4636]: I1003 14:02:13.997250 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:13Z","lastTransitionTime":"2025-10-03T14:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.100959 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.101048 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.101071 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.101162 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.101186 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:14Z","lastTransitionTime":"2025-10-03T14:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.203870 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.203939 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.203958 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.203986 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.204006 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:14Z","lastTransitionTime":"2025-10-03T14:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.307187 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.307244 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.307258 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.307279 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.307293 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:14Z","lastTransitionTime":"2025-10-03T14:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.409984 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.410055 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.410065 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.410112 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.410128 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:14Z","lastTransitionTime":"2025-10-03T14:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.514534 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.514610 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.514631 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.514662 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.514686 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:14Z","lastTransitionTime":"2025-10-03T14:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.617649 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.617762 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.617794 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.617833 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.617857 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:14Z","lastTransitionTime":"2025-10-03T14:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.720617 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.720666 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.720677 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.720695 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.720707 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:14Z","lastTransitionTime":"2025-10-03T14:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.793301 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.793384 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.793324 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:02:14 crc kubenswrapper[4636]: E1003 14:02:14.793528 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.793562 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:02:14 crc kubenswrapper[4636]: E1003 14:02:14.793694 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:02:14 crc kubenswrapper[4636]: E1003 14:02:14.793878 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:02:14 crc kubenswrapper[4636]: E1003 14:02:14.793980 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
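
Two distinct readiness gates are failing in the entries above: the kubelet finds no CNI config in /etc/kubernetes/cni/net.d/, and (per the earlier kube-multus termination message) multus times out waiting for the ovn-kubernetes readiness-indicator file. A small check sketch, assuming it runs directly on the node (the multus path below drops the /host prefix that the container's log shows):

import os

CNI_CONF_DIR = "/etc/kubernetes/cni/net.d"  # from the NetworkReady=false message
OVN_INDICATOR = "/run/multus/cni/net.d/10-ovn-kubernetes.conf"  # from the multus log

confs = sorted(os.listdir(CNI_CONF_DIR)) if os.path.isdir(CNI_CONF_DIR) else []
print(f"{CNI_CONF_DIR}: {confs or 'empty -> NetworkPluginNotReady'}")
print(f"{OVN_INDICATOR}: {'present' if os.path.exists(OVN_INDICATOR) else 'missing'}")

Both stay empty/missing here because ovnkube-controller is in CrashLoopBackOff and never gets far enough to write its CNI config.
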
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.824080 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.824161 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.824180 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.824204 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.824222 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:14Z","lastTransitionTime":"2025-10-03T14:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.927147 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.927195 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.927207 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.927228 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:14 crc kubenswrapper[4636]: I1003 14:02:14.927240 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:14Z","lastTransitionTime":"2025-10-03T14:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.029647 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.029701 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.029714 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.029735 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.029745 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:15Z","lastTransitionTime":"2025-10-03T14:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.132438 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.132485 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.132502 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.132523 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.132539 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:15Z","lastTransitionTime":"2025-10-03T14:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.235089 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.235163 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.235173 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.235195 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.235207 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:15Z","lastTransitionTime":"2025-10-03T14:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.339505 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.339573 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.339597 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.339680 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.339715 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:15Z","lastTransitionTime":"2025-10-03T14:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.442334 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.442368 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.442377 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.442391 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.442400 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:15Z","lastTransitionTime":"2025-10-03T14:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.545153 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.545190 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.545201 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.545217 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.545227 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:15Z","lastTransitionTime":"2025-10-03T14:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.646864 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:02:15 crc kubenswrapper[4636]: E1003 14:02:15.647063 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:19.647046286 +0000 UTC m=+149.505772533 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.647926 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.647972 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.647986 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.648002 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.648011 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:15Z","lastTransitionTime":"2025-10-03T14:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.748125 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.748188 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.748217 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.748240 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:02:15 crc kubenswrapper[4636]: E1003 14:02:15.748320 4636 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:02:15 crc kubenswrapper[4636]: E1003 14:02:15.748389 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 14:03:19.7483608 +0000 UTC m=+149.607087047 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 14:02:15 crc kubenswrapper[4636]: E1003 14:02:15.748492 4636 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:02:15 crc kubenswrapper[4636]: E1003 14:02:15.748566 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:02:15 crc kubenswrapper[4636]: E1003 14:02:15.748592 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
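
The kube-api-access-* mounts fail above because a projected volume only materializes once every one of its sources resolves. For orientation, a sketch of the sources such a volume typically projects (field names per the Kubernetes API; the exact source list is an assumption about these pods, with openshift-service-ca.crt added by OpenShift's service CA injection):

# Illustrative shape of a "kube-api-access-*" projected volume's sources.
kube_api_access_sources = [
    {"serviceAccountToken": {"expirationSeconds": 3607, "path": "token"}},
    {"configMap": {"name": "kube-root-ca.crt",
                   "items": [{"key": "ca.crt", "path": "ca.crt"}]}},
    {"downwardAPI": {"items": [{"path": "namespace",
                                "fieldRef": {"fieldPath": "metadata.namespace"}}]}},
    {"configMap": {"name": "openshift-service-ca.crt",
                   "items": [{"key": "service-ca.crt", "path": "service-ca.crt"}]}},
]
# kube-root-ca.crt and openshift-service-ca.crt are exactly the two objects
# reported as "not registered" above, so MountVolume.SetUp fails as a whole.
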
No retries permitted until 2025-10-03 14:03:19.748573565 +0000 UTC m=+149.607299812 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 14:02:15 crc kubenswrapper[4636]: E1003 14:02:15.748598 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:02:15 crc kubenswrapper[4636]: E1003 14:02:15.748614 4636 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:02:15 crc kubenswrapper[4636]: E1003 14:02:15.748630 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 14:02:15 crc kubenswrapper[4636]: E1003 14:02:15.748708 4636 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 14:02:15 crc kubenswrapper[4636]: E1003 14:02:15.748739 4636 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:02:15 crc kubenswrapper[4636]: E1003 14:02:15.748677 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 14:03:19.748657257 +0000 UTC m=+149.607383584 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:02:15 crc kubenswrapper[4636]: E1003 14:02:15.748850 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 14:03:19.748812141 +0000 UTC m=+149.607538428 (durationBeforeRetry 1m4s). 
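
The 1m4s in these retry lines is consistent with exponential backoff on repeated volume-operation failures; a back-of-envelope sketch (the 500ms initial delay and doubling factor are an assumption based on kubelet's exponential-backoff defaults, not something read from this log):

delay_s, failures = 0.5, 1
while delay_s < 64:      # 1m4s == 64s, as logged in durationBeforeRetry
    delay_s *= 2
    failures += 1
print(failures, delay_s)  # -> 8 64.0: eighth consecutive failure of the operation
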
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.750268 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.750312 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.750326 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.750345 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.750358 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:15Z","lastTransitionTime":"2025-10-03T14:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.853193 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.853237 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.853245 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.853260 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.853269 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:15Z","lastTransitionTime":"2025-10-03T14:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.956580 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.956629 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.956638 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.956660 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:15 crc kubenswrapper[4636]: I1003 14:02:15.956674 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:15Z","lastTransitionTime":"2025-10-03T14:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.058882 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.058924 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.058934 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.058950 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.058960 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:16Z","lastTransitionTime":"2025-10-03T14:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.161815 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.161855 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.161864 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.161878 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.161889 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:16Z","lastTransitionTime":"2025-10-03T14:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.264493 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.264569 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.264588 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.264618 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.264638 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:16Z","lastTransitionTime":"2025-10-03T14:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.368358 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.368408 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.368420 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.368437 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.368448 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:16Z","lastTransitionTime":"2025-10-03T14:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.471997 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.472077 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.472091 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.472130 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.472147 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:16Z","lastTransitionTime":"2025-10-03T14:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.575126 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.575191 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.575206 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.575234 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.575253 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:16Z","lastTransitionTime":"2025-10-03T14:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.679007 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.679072 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.679085 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.679159 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.679176 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:16Z","lastTransitionTime":"2025-10-03T14:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.782360 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.782406 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.782419 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.782439 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.782452 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:16Z","lastTransitionTime":"2025-10-03T14:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.793736 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.793836 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:02:16 crc kubenswrapper[4636]: E1003 14:02:16.793885 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.793974 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:16 crc kubenswrapper[4636]: E1003 14:02:16.794073 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.794092 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:02:16 crc kubenswrapper[4636]: E1003 14:02:16.794329 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:02:16 crc kubenswrapper[4636]: E1003 14:02:16.794574 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.806021 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.885893 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.885932 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.885942 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.885958 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.885971 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:16Z","lastTransitionTime":"2025-10-03T14:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.988663 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.988712 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.988724 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.988948 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:16 crc kubenswrapper[4636]: I1003 14:02:16.988968 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:16Z","lastTransitionTime":"2025-10-03T14:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.092041 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.092089 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.092118 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.092137 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.092149 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:17Z","lastTransitionTime":"2025-10-03T14:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.194860 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.194906 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.194916 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.194930 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.194939 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:17Z","lastTransitionTime":"2025-10-03T14:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.296975 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.297012 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.297023 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.297037 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.297046 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:17Z","lastTransitionTime":"2025-10-03T14:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.399704 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.399743 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.399755 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.399770 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.399783 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:17Z","lastTransitionTime":"2025-10-03T14:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.501694 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.501731 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.501741 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.501755 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.501764 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:17Z","lastTransitionTime":"2025-10-03T14:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.604211 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.604240 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.604249 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.604265 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.604276 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:17Z","lastTransitionTime":"2025-10-03T14:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.706602 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.706644 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.706653 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.706667 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.706676 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:17Z","lastTransitionTime":"2025-10-03T14:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.809466 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.809721 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.809788 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.809863 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.809930 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:17Z","lastTransitionTime":"2025-10-03T14:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.911690 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.911952 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.912034 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.912128 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:17 crc kubenswrapper[4636]: I1003 14:02:17.912200 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:17Z","lastTransitionTime":"2025-10-03T14:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.014815 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.014848 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.014863 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.014877 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.014886 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:18Z","lastTransitionTime":"2025-10-03T14:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.117567 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.117611 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.117621 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.117637 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.117646 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:18Z","lastTransitionTime":"2025-10-03T14:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.219473 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.219521 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.219530 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.219546 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.219560 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:18Z","lastTransitionTime":"2025-10-03T14:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.322569 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.322606 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.322616 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.322632 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.322642 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:18Z","lastTransitionTime":"2025-10-03T14:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.424869 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.424919 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.424928 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.424945 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.424955 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:18Z","lastTransitionTime":"2025-10-03T14:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.527370 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.527419 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.527433 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.527451 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.527462 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:18Z","lastTransitionTime":"2025-10-03T14:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.629601 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.629637 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.629650 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.629665 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.629676 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:18Z","lastTransitionTime":"2025-10-03T14:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.731471 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.731723 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.731855 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.731957 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.732023 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:18Z","lastTransitionTime":"2025-10-03T14:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.793290 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:02:18 crc kubenswrapper[4636]: E1003 14:02:18.793452 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.793511 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.793556 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:18 crc kubenswrapper[4636]: E1003 14:02:18.793659 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.793749 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:02:18 crc kubenswrapper[4636]: E1003 14:02:18.793855 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:02:18 crc kubenswrapper[4636]: E1003 14:02:18.793978 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.833901 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.833942 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.833952 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.833968 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.833979 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:18Z","lastTransitionTime":"2025-10-03T14:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.935783 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.935816 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.935827 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.935842 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:18 crc kubenswrapper[4636]: I1003 14:02:18.935853 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:18Z","lastTransitionTime":"2025-10-03T14:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.038233 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.038277 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.038286 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.038302 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.038312 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:19Z","lastTransitionTime":"2025-10-03T14:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.052738 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.052795 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.052807 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.052823 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.052832 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:19Z","lastTransitionTime":"2025-10-03T14:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:19 crc kubenswrapper[4636]: E1003 14:02:19.064806 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:19Z is after 
2025-08-24T17:21:41Z" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.068493 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.068531 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.068566 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.068602 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.068615 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:19Z","lastTransitionTime":"2025-10-03T14:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:19 crc kubenswrapper[4636]: E1003 14:02:19.080157 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:19Z is after 
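Every one of these retries fails for the same reason: the serving certificate of the node.network-node-identity.openshift.io webhook expired on 2025-08-24T17:21:41Z, well before the time the kubelet reports. To confirm the expiry independently of the kubelet, one can dial the webhook endpoint and read the certificate's validity window. The following Go sketch is illustrative only (it is not OpenShift or kubelet code) and assumes the 127.0.0.1:9743 endpoint from the log is reachable where it runs:

```go
// check-webhook-cert.go: minimal sketch that prints the validity window of the
// TLS certificate served at the webhook address seen in the log above.
// InsecureSkipVerify is used only so the handshake completes long enough to
// inspect an already-expired certificate; never use it for real traffic.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("handshake with webhook endpoint failed: %v", err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now().UTC()
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))
	if now.After(cert.NotAfter) {
		// Mirrors the x509 error in the log: current time is after notAfter.
		fmt.Printf("expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	}
}
```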
2025-08-24T17:21:41Z" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.083472 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.083513 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.083522 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.083537 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.083547 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:19Z","lastTransitionTime":"2025-10-03T14:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:19 crc kubenswrapper[4636]: E1003 14:02:19.095524 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:19Z is after 
2025-08-24T17:21:41Z" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.098686 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.098712 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.098720 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.098732 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.098742 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:19Z","lastTransitionTime":"2025-10-03T14:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:19 crc kubenswrapper[4636]: E1003 14:02:19.109150 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:19Z is after 
2025-08-24T17:21:41Z" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.112074 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.112150 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.112166 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.112205 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.112218 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:19Z","lastTransitionTime":"2025-10-03T14:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:19 crc kubenswrapper[4636]: E1003 14:02:19.123563 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:19Z is after 
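Five "will retry" attempts are visible above (the one whose tail opens this stretch, then 14:02:19.080157 through 14:02:19.123563); the "exceeds retry count" entry just below shows the kubelet giving up on this status update rather than retrying forever. A minimal sketch of that bounded-retry pattern, with patchNodeStatus as a hypothetical stand-in for the real PATCH call and a retry limit of five assumed to match the log:

```go
// retry-sketch.go: illustrative bounded-retry loop, not kubelet source code.
// patchNodeStatus is a hypothetical stand-in for the real node PATCH, which in
// the log fails every time because the admission webhook's certificate expired.
package main

import (
	"errors"
	"fmt"
)

const nodeStatusUpdateRetry = 5 // assumed to match the five attempts in the log

func patchNodeStatus() error {
	return errors.New(`failed calling webhook "node.network-node-identity.openshift.io": certificate has expired`)
}

func updateNodeStatus() error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := patchNodeStatus(); err != nil {
			fmt.Printf("Error updating node status, will retry: %v\n", err)
			continue
		}
		return nil
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	if err := updateNodeStatus(); err != nil {
		fmt.Println("Unable to update node status:", err)
	}
}
```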
2025-08-24T17:21:41Z" Oct 03 14:02:19 crc kubenswrapper[4636]: E1003 14:02:19.123792 4636 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.140727 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.140788 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.140798 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.140814 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.140825 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:19Z","lastTransitionTime":"2025-10-03T14:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.243433 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.243468 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.243475 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.243492 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.243501 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:19Z","lastTransitionTime":"2025-10-03T14:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.346475 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.346507 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.346515 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.346528 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.346538 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:19Z","lastTransitionTime":"2025-10-03T14:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.448438 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.448481 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.448498 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.448515 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.448525 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:19Z","lastTransitionTime":"2025-10-03T14:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.551026 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.551067 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.551077 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.551116 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.551130 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:19Z","lastTransitionTime":"2025-10-03T14:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.654864 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.654894 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.654926 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.654944 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.654954 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:19Z","lastTransitionTime":"2025-10-03T14:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.757000 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.757129 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.757146 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.757162 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.757170 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:19Z","lastTransitionTime":"2025-10-03T14:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.860349 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.860392 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.860402 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.860417 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.860429 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:19Z","lastTransitionTime":"2025-10-03T14:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.962910 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.962983 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.962993 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.963009 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:19 crc kubenswrapper[4636]: I1003 14:02:19.963021 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:19Z","lastTransitionTime":"2025-10-03T14:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.066203 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.066244 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.066252 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.066268 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.066279 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:20Z","lastTransitionTime":"2025-10-03T14:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.168122 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.168167 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.168179 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.168196 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.168210 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:20Z","lastTransitionTime":"2025-10-03T14:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.270800 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.270844 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.270853 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.270869 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.270880 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:20Z","lastTransitionTime":"2025-10-03T14:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.373361 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.373443 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.373467 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.373499 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.373522 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:20Z","lastTransitionTime":"2025-10-03T14:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.476733 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.476828 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.476851 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.476889 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.476912 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:20Z","lastTransitionTime":"2025-10-03T14:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.580285 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.580340 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.580351 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.580372 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.580385 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:20Z","lastTransitionTime":"2025-10-03T14:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.683087 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.683155 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.683166 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.683180 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.683190 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:20Z","lastTransitionTime":"2025-10-03T14:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.785242 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.785285 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.785294 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.785309 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.785320 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:20Z","lastTransitionTime":"2025-10-03T14:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.794406 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.794497 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:02:20 crc kubenswrapper[4636]: E1003 14:02:20.794539 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:20 crc kubenswrapper[4636]: E1003 14:02:20.794643 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.794876 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:20 crc kubenswrapper[4636]: E1003 14:02:20.794962 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.795036 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:02:20 crc kubenswrapper[4636]: E1003 14:02:20.795087 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.812499 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.831295 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:02:11Z\\\",\\\"message\\\":\\\"obj_retry.go:551] Creating *factory.egressNode crc took: 28.926067ms\\\\nI1003 14:02:11.918695 6573 factory.go:1336] Added *v1.Node event handler 7\\\\nI1003 14:02:11.918762 6573 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1003 14:02:11.919169 6573 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1003 14:02:11.919255 6573 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 14:02:11.919293 6573 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 14:02:11.919384 6573 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 14:02:11.919400 6573 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 14:02:11.919553 6573 factory.go:656] Stopping watch factory\\\\nI1003 14:02:11.919586 6573 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 14:02:11.919280 6573 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1003 14:02:11.919658 6573 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 14:02:11.919672 6573 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 14:02:11.919692 6573 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 14:02:11.924322 6573 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:02:11.924380 6573 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:02:11.924470 6573 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:02:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t7xd5_openshift-ovn-kubernetes(564529e3-ff40-4923-9f6d-319a9b41720a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.844025 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm9z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.862660 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.880300 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:20Z is after 2025-08-24T17:21:41Z"
Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.888137 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.888256 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.888268 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.888286 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.888297 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:20Z","lastTransitionTime":"2025-10-03T14:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.896727 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.910988 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f09d19aad0b3dd34eb48df35bc872b186fab30f7d6dc9fae25b3fa3b5b2c1d85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:02:03Z\\\",\\\"message\\\":\\\"2025-10-03T14:01:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6e5679ce-597f-479f-bebe-cabc4f1dd8f1\\\\n2025-10-03T14:01:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6e5679ce-597f-479f-bebe-cabc4f1dd8f1 to /host/opt/cni/bin/\\\\n2025-10-03T14:01:18Z [verbose] multus-daemon started\\\\n2025-10-03T14:01:18Z [verbose] Readiness Indicator file check\\\\n2025-10-03T14:02:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.921397 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.930062 4636 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ed9d52d-9394-4f9f-b1e4-ee7b4481a530\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://449f08b53e055181f2144302dcff762922e28aaa605b9256e9c0e0d4b2027413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5de2406a1c7eb859a4433c77e351aeefe545517d1fe3bf914419b6db29a6a44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5de2406a1c7eb859a4433c77e351aeefe545517d1fe3bf914419b6db29a6a44c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.940114 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01de5e2b-fdd5-441c-b16a-2a8c23d8520a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae09654ffcd8f59fc1ea875ec17a86d9e421644363bb7b761ee5c32d52760fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4c344be5f6e3ff9611f5000d87254fda72718ca79eda40aa735ba5f1bd95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cf3f403152468f7c3ab024e00ff69e54557a194ac3708c01d993c5f46ff58a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.951596 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.964845 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.977728 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.989952 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:20Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.992446 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.992471 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.992479 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.992493 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:20 crc kubenswrapper[4636]: I1003 14:02:20.992503 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:20Z","lastTransitionTime":"2025-10-03T14:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.003203 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f185765f3ef723b58932469681a3cb9249e44073b26c30748a9716a1b7dd51d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:21Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.020990 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:21Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.034264 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb259a3c7fa12a75c4729979def8392191091bb90da1bce6bb7112e0a5995392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b3819c73c2b8e54026a6a13ab75376992530ca159f858bdafa1183a699beb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:21Z is after 2025-08-24T17:21:41Z" Oct 03 
14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.048530 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:21Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.094876 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.094935 4636 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.094945 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.094963 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.094972 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:21Z","lastTransitionTime":"2025-10-03T14:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.198021 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.198084 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.198135 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.198163 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.198182 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:21Z","lastTransitionTime":"2025-10-03T14:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.300643 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.300714 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.300733 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.300763 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.300781 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:21Z","lastTransitionTime":"2025-10-03T14:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
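
Every status_manager.go:875 failure above shares the same root cause, stated verbatim in each record: the kubelet's status PATCH is rejected because the validating webhook pod.network-node-identity.openshift.io at https://127.0.0.1:9743 presents a serving certificate that expired at 2025-08-24T17:21:41Z, while the node clock reads 2025-10-03. A minimal sketch to confirm the expiry from the node itself follows; the host and port are taken from the log, and it assumes Python with the third-party cryptography package installed:

```python
# Minimal sketch: fetch the webhook's serving certificate and compare its
# notAfter against the clock, mirroring the x509 error in the records above.
# Assumes it runs on the node itself (the webhook listens on loopback) and
# that the third-party "cryptography" package is available.
import datetime
import ssl

from cryptography import x509

HOST, PORT = "127.0.0.1", 9743  # endpoint named in the webhook errors above

# get_server_certificate() performs no chain verification, so it still
# completes the handshake when the presented certificate is expired.
pem = ssl.get_server_certificate((HOST, PORT))
cert = x509.load_pem_x509_certificate(pem.encode())

now = datetime.datetime.now(datetime.timezone.utc)
not_after = cert.not_valid_after.replace(tzinfo=datetime.timezone.utc)  # naive UTC in older cryptography versions
print("subject: ", cert.subject.rfc4514_string())
print("notAfter:", not_after.isoformat())
print("expired: ", now > not_after)  # True for the state captured in this log
```

The webhook appears to be served by the network-node-identity-vrzqb pod on this same node (its status record above shows a "webhook" container mounting /etc/webhook-cert/), so status updates stay blocked until that certificate is rotated; the kubelet's repeated retries are the expected behavior, not a separate fault.
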
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.403992 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.404037 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.404047 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.404065 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.404078 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:21Z","lastTransitionTime":"2025-10-03T14:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.507289 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.507353 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.507377 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.507421 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.507442 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:21Z","lastTransitionTime":"2025-10-03T14:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.610421 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.610520 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.610538 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.610578 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.610615 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:21Z","lastTransitionTime":"2025-10-03T14:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
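
Interleaved with the webhook failures, the node is flapping NotReady for a second, independent reason: the container runtime reports no CNI configuration in /etc/kubernetes/cni/net.d/, and the kube-multus record further up shows the daemon timing out while waiting for the OVN-Kubernetes readiness indicator at /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. A small poller in the same spirit as that readiness check; the directory paths are the ones named in this log, while the timeout and interval are illustrative values, not kubelet's:

```python
# Minimal sketch: wait for a CNI config to appear, mirroring the readiness
# checks that are failing above. Paths come from the log records; the 45 s
# deadline and 2 s interval are illustrative assumptions.
import os
import time

CNI_DIRS = [
    "/etc/kubernetes/cni/net.d",   # where the kubelet is looking
    "/host/run/multus/cni/net.d",  # where multus expected 10-ovn-kubernetes.conf
]

deadline = time.monotonic() + 45
while time.monotonic() < deadline:
    found = [
        os.path.join(d, name)
        for d in CNI_DIRS
        if os.path.isdir(d)
        for name in sorted(os.listdir(d))
    ]
    if found:
        print("CNI configuration present:", *found, sep="\n  ")
        break
    time.sleep(2)
else:
    # Matches the kubelet's complaint: the network provider never wrote a config.
    print("timed out: no CNI configuration file; has your network provider started?")
```
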
Has your network provider started?"} Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.713972 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.714018 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.714029 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.714046 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.714057 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:21Z","lastTransitionTime":"2025-10-03T14:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.816369 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.816410 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.816420 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.816437 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.816447 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:21Z","lastTransitionTime":"2025-10-03T14:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.817709 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"]
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.919184 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.919228 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.919238 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.919257 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:21 crc kubenswrapper[4636]: I1003 14:02:21.919268 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:21Z","lastTransitionTime":"2025-10-03T14:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.021294 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.021332 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.021342 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.021356 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.021366 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:22Z","lastTransitionTime":"2025-10-03T14:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.124043 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.124087 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.124124 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.124143 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.124156 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:22Z","lastTransitionTime":"2025-10-03T14:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.226295 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.226359 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.226376 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.226397 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.226412 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:22Z","lastTransitionTime":"2025-10-03T14:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.329035 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.329065 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.329073 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.329089 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.329115 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:22Z","lastTransitionTime":"2025-10-03T14:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.432219 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.432281 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.432295 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.432324 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.432343 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:22Z","lastTransitionTime":"2025-10-03T14:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.535324 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.535406 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.535430 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.535461 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.535483 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:22Z","lastTransitionTime":"2025-10-03T14:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.638914 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.638970 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.638985 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.639008 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.639024 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:22Z","lastTransitionTime":"2025-10-03T14:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.741831 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.741872 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.741884 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.741899 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.741909 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:22Z","lastTransitionTime":"2025-10-03T14:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.793588 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.793632 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.793654 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7"
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.793632 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 14:02:22 crc kubenswrapper[4636]: E1003 14:02:22.793739 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 03 14:02:22 crc kubenswrapper[4636]: E1003 14:02:22.793833 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 03 14:02:22 crc kubenswrapper[4636]: E1003 14:02:22.793876 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd"
Oct 03 14:02:22 crc kubenswrapper[4636]: E1003 14:02:22.793926 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.846429 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.846520 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.846550 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.846585 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.846608 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:22Z","lastTransitionTime":"2025-10-03T14:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.950004 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.950054 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.950065 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.950083 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:22 crc kubenswrapper[4636]: I1003 14:02:22.950094 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:22Z","lastTransitionTime":"2025-10-03T14:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.052932 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.052997 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.053014 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.053043 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.053059 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:23Z","lastTransitionTime":"2025-10-03T14:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.155746 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.155817 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.155843 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.155874 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.155892 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:23Z","lastTransitionTime":"2025-10-03T14:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.258577 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.258622 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.258634 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.258655 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.258667 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:23Z","lastTransitionTime":"2025-10-03T14:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.361985 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.362060 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.362080 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.362136 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.362156 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:23Z","lastTransitionTime":"2025-10-03T14:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.465680 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.465744 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.465756 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.465778 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.465791 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:23Z","lastTransitionTime":"2025-10-03T14:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.569230 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.569299 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.569317 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.569343 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.569361 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:23Z","lastTransitionTime":"2025-10-03T14:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.672495 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.672541 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.672554 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.672572 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.672584 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:23Z","lastTransitionTime":"2025-10-03T14:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.775606 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.775675 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.775689 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.775707 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.775721 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:23Z","lastTransitionTime":"2025-10-03T14:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.878738 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.878807 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.878824 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.878849 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.878866 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:23Z","lastTransitionTime":"2025-10-03T14:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.981431 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.981467 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.981478 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.981497 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:23 crc kubenswrapper[4636]: I1003 14:02:23.981511 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:23Z","lastTransitionTime":"2025-10-03T14:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.084322 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.084361 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.084372 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.084388 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.084399 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:24Z","lastTransitionTime":"2025-10-03T14:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.187064 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.187141 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.187157 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.187182 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.187198 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:24Z","lastTransitionTime":"2025-10-03T14:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.289858 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.289917 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.289927 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.289942 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.289954 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:24Z","lastTransitionTime":"2025-10-03T14:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.392736 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.392779 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.392788 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.392805 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.392814 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:24Z","lastTransitionTime":"2025-10-03T14:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.495291 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.495349 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.495364 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.495384 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.495396 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:24Z","lastTransitionTime":"2025-10-03T14:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.598616 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.598654 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.598664 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.598684 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.598694 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:24Z","lastTransitionTime":"2025-10-03T14:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.701437 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.701483 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.701494 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.701513 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.701525 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:24Z","lastTransitionTime":"2025-10-03T14:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.794332 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 14:02:24 crc kubenswrapper[4636]: E1003 14:02:24.794530 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.794815 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7"
Oct 03 14:02:24 crc kubenswrapper[4636]: E1003 14:02:24.794936 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd"
Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.795206 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 14:02:24 crc kubenswrapper[4636]: E1003 14:02:24.795354 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.795773 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 14:02:24 crc kubenswrapper[4636]: E1003 14:02:24.795925 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.804306 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.804356 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.804373 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.804395 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.804412 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:24Z","lastTransitionTime":"2025-10-03T14:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.907326 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.907599 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.907704 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.907793 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:24 crc kubenswrapper[4636]: I1003 14:02:24.907875 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:24Z","lastTransitionTime":"2025-10-03T14:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.010469 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.010520 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.010535 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.010558 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.010573 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:25Z","lastTransitionTime":"2025-10-03T14:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.112762 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.113023 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.113107 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.113215 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.113315 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:25Z","lastTransitionTime":"2025-10-03T14:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.216673 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.216928 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.217011 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.217118 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.217186 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:25Z","lastTransitionTime":"2025-10-03T14:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.320295 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.320525 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.320587 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.320704 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.320775 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:25Z","lastTransitionTime":"2025-10-03T14:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.423131 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.423417 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.423484 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.423566 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.423667 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:25Z","lastTransitionTime":"2025-10-03T14:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.526582 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.526632 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.526641 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.526656 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.526665 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:25Z","lastTransitionTime":"2025-10-03T14:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.629611 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.630146 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.630237 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.630323 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.630383 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:25Z","lastTransitionTime":"2025-10-03T14:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.733477 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.733717 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.733851 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.733912 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.733969 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:25Z","lastTransitionTime":"2025-10-03T14:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.837186 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.837237 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.837249 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.837271 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:25 crc kubenswrapper[4636]: I1003 14:02:25.837296 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:25Z","lastTransitionTime":"2025-10-03T14:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[... the same five records repeat at roughly 100 ms intervals (14:02:25.939 through 14:02:26.766) with only timestamps changing; duplicate entries trimmed ...]
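The block above is the kubelet's node-status loop: the same four events plus the "Node became not ready" condition are re-recorded every ~100 ms because the Ready condition stays False for the same CNI reason. A minimal way to watch the same condition from outside the node (a diagnostic sketch, not part of the log; assumes an authenticated oc client against this CRC cluster):

    # Show the Ready condition's reason and message, matching the setters.go records above.
    oc get node crc -o jsonpath='{.status.conditions[?(@.type=="Ready")].reason}{": "}{.status.conditions[?(@.type=="Ready")].message}{"\n"}'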
Oct 03 14:02:26 crc kubenswrapper[4636]: I1003 14:02:26.794489 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 14:02:26 crc kubenswrapper[4636]: I1003 14:02:26.794563 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7"
Oct 03 14:02:26 crc kubenswrapper[4636]: I1003 14:02:26.794575 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 14:02:26 crc kubenswrapper[4636]: E1003 14:02:26.794686 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 03 14:02:26 crc kubenswrapper[4636]: E1003 14:02:26.794925 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 03 14:02:26 crc kubenswrapper[4636]: E1003 14:02:26.795037 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd"
Oct 03 14:02:26 crc kubenswrapper[4636]: I1003 14:02:26.795211 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 14:02:26 crc kubenswrapper[4636]: E1003 14:02:26.795331 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
[... node-status records continue repeating at ~100 ms intervals (14:02:26.869 through 14:02:27.698); duplicate entries trimmed ...]
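The "No sandbox for pod" and "Error syncing pod" records show the knock-on effect: sandbox creation for the network-diagnostics, multus, and console-plugin pods is blocked until a CNI configuration exists. To confirm what the kubelet sees on disk (a sketch run on the node itself; the directory comes from the error message, while the expected file name is an assumption about how ovn-kubernetes normally populates it):

    # The kubelet looks for CNI configuration in the directory named in the error;
    # on a healthy OVN-Kubernetes node this typically contains a file like 10-ovn-kubernetes.conf.
    ls -l /etc/kubernetes/cni/net.d/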
Oct 03 14:02:27 crc kubenswrapper[4636]: I1003 14:02:27.793829 4636 scope.go:117] "RemoveContainer" containerID="1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef"
Oct 03 14:02:27 crc kubenswrapper[4636]: E1003 14:02:27.793998 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-t7xd5_openshift-ovn-kubernetes(564529e3-ff40-4923-9f6d-319a9b41720a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" podUID="564529e3-ff40-4923-9f6d-319a9b41720a"
[... node-status records continue repeating at ~100 ms intervals (14:02:27.800 through 14:02:28.730); duplicate entries trimmed ...]
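The RemoveContainer/CrashLoopBackOff pair explains why no CNI configuration ever appears: ovnkube-controller, the container responsible for bringing up the node network, keeps exiting and is now in a 40 s back-off. Pulling the previous container's logs would show its exit reason (a sketch; the namespace, pod, and container names are taken from the record above):

    # Inspect the last run of the crashing container that should be writing the CNI config.
    oc -n openshift-ovn-kubernetes logs pod/ovnkube-node-t7xd5 -c ovnkube-controller --previous | tail -n 50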
Oct 03 14:02:28 crc kubenswrapper[4636]: I1003 14:02:28.793657 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7"
Oct 03 14:02:28 crc kubenswrapper[4636]: I1003 14:02:28.793678 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 14:02:28 crc kubenswrapper[4636]: I1003 14:02:28.793777 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 14:02:28 crc kubenswrapper[4636]: I1003 14:02:28.794258 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 14:02:28 crc kubenswrapper[4636]: E1003 14:02:28.794406 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd"
Oct 03 14:02:28 crc kubenswrapper[4636]: E1003 14:02:28.794666 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 03 14:02:28 crc kubenswrapper[4636]: E1003 14:02:28.794943 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 03 14:02:28 crc kubenswrapper[4636]: E1003 14:02:28.795139 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
[... node-status records continue repeating at ~100 ms intervals (14:02:28.833 through 14:02:29.140); duplicate entries trimmed ...]
Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.186369 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.186421 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.186435 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.186454 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.186468 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:29Z","lastTransitionTime":"2025-10-03T14:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:29 crc kubenswrapper[4636]: E1003 14:02:29.201279 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.204765 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.204805 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.204821 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.204843 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.204857 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:29Z","lastTransitionTime":"2025-10-03T14:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:29 crc kubenswrapper[4636]: E1003 14:02:29.216859 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.221089 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.221155 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.221173 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.221198 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.221214 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:29Z","lastTransitionTime":"2025-10-03T14:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:29 crc kubenswrapper[4636]: E1003 14:02:29.235187 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.238973 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.239008 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
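Every one of these PATCH attempts dies at the same place: the node.network-node-identity.openshift.io webhook rejects the TLS handshake because its serving certificate expired on 2025-08-24T17:21:41Z, roughly 40 days before the log's current time of 2025-10-03T14:02:29Z. A minimal Go sketch (not from the log; assumes it runs on the node itself, since 127.0.0.1:9743 is loopback) to confirm the certificate's validity window:

```go
// Dial the webhook endpoint named in the error and print its serving
// certificate's validity window. InsecureSkipVerify is deliberate here:
// the point is to read the expired certificate, not to trust it.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true,
	})
	if err != nil {
		log.Fatalf("dial webhook endpoint: %v", err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Printf("subject:     %s\n", cert.Subject)
	fmt.Printf("notBefore:   %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:    %s\n", cert.NotAfter.Format(time.RFC3339))
	fmt.Printf("expired for: %s\n", time.Since(cert.NotAfter).Round(time.Minute))
}
```

If notAfter matches the 2025-08-24T17:21:41Z in the error, the fix path is certificate rotation (or the cluster's own cert recovery), not anything in the status payload being patched.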
event="NodeHasNoDiskPressure" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.239020 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.239037 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.239049 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:29Z","lastTransitionTime":"2025-10-03T14:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:29 crc kubenswrapper[4636]: E1003 14:02:29.257914 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.262624 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.262686 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
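Separately from the webhook failure, the NotReady condition itself comes from the container runtime's CNI check: there is nothing under /etc/kubernetes/cni/net.d/. A small sketch of that check (simplified; the real ocicni/libcni watcher also parses the files, and the .conf/.conflist/.json extension list is the usual libcni set, assumed here):

```go
// List CNI network configs where the runtime is looking; if none exist,
// NetworkReady stays false and the node stays NotReady, exactly as the
// log shows. Run on the node.
package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
)

func main() {
	const dir = "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(dir)
	if err != nil {
		log.Fatalf("read %s: %v", dir, err)
	}
	found := 0
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("CNI config:", filepath.Join(dir, e.Name()))
			found++
		}
	}
	if found == 0 {
		fmt.Println("no CNI configuration files; NetworkReady stays false")
	}
}
```

An empty directory here is consistent with the network plugin not having started yet, which is why the same KubeletNotReady condition keeps being re-recorded below.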
event="NodeHasNoDiskPressure" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.262697 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.262716 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.262727 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:29Z","lastTransitionTime":"2025-10-03T14:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:29 crc kubenswrapper[4636]: E1003 14:02:29.276809 4636 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9943c44-af0e-4d0e-8d9b-fbf9dab653b1\\\",\\\"systemUUID\\\":\\\"5822d918-3835-42d5-a2d8-0c9b2af0c4b1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:29Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:29 crc kubenswrapper[4636]: E1003 14:02:29.276966 4636 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.278894 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
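The 14:02:29.276966 entry is the kubelet exhausting its per-sync retry budget: the status update is attempted a fixed number of times (nodeStatusUpdateRetry, 5 in upstream kubelet) and every attempt hit the same webhook error. A sketch of that loop's shape (names mirror upstream kubelet_node_status.go; the PATCH itself is stubbed with the webhook error seen above):

```go
// Sketch of the kubelet's node-status retry loop. Each failed attempt
// produces one "Error updating node status, will retry" entry; after
// nodeStatusUpdateRetry failures it logs "Unable to update node status".
package main

import (
	"errors"
	"fmt"
)

const nodeStatusUpdateRetry = 5 // upstream kubelet constant

func tryUpdateNodeStatus(attempt int) error {
	// Stub standing in for the PATCH that the webhook rejected above.
	return errors.New(`Internal error occurred: failed calling webhook "node.network-node-identity.openshift.io"`)
}

func updateNodeStatus() error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := tryUpdateNodeStatus(i); err != nil {
			fmt.Printf("Error updating node status, will retry: %v\n", err)
			continue
		}
		return nil
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	if err := updateNodeStatus(); err != nil {
		fmt.Println("Unable to update node status:", err)
	}
}
```

The next sync loop starts the budget over, which is why the identical error sequence recurs for as long as the certificate stays expired.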
event="NodeHasSufficientMemory" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.278926 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.278938 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.278956 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.278968 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:29Z","lastTransitionTime":"2025-10-03T14:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.380953 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.381014 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.381026 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.381047 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.381060 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:29Z","lastTransitionTime":"2025-10-03T14:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.484493 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.484531 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.484539 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.484554 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.484566 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:29Z","lastTransitionTime":"2025-10-03T14:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.587202 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.587231 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.587239 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.587252 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.587261 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:29Z","lastTransitionTime":"2025-10-03T14:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.689819 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.689901 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.689913 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.689932 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.689944 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:29Z","lastTransitionTime":"2025-10-03T14:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.800283 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.800378 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.800402 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.800438 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.800475 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:29Z","lastTransitionTime":"2025-10-03T14:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.903793 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.903853 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.903874 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.903897 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:29 crc kubenswrapper[4636]: I1003 14:02:29.903914 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:29Z","lastTransitionTime":"2025-10-03T14:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.007508 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.007577 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.007596 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.007627 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.007648 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:30Z","lastTransitionTime":"2025-10-03T14:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.110758 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.110804 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.110817 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.110835 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.110848 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:30Z","lastTransitionTime":"2025-10-03T14:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.213688 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.213725 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.213735 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.213749 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.213759 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:30Z","lastTransitionTime":"2025-10-03T14:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.316280 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.316355 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.316365 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.316467 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.316479 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:30Z","lastTransitionTime":"2025-10-03T14:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.418885 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.418934 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.418946 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.418965 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.418979 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:30Z","lastTransitionTime":"2025-10-03T14:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.521729 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.521780 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.521791 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.521815 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.521827 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:30Z","lastTransitionTime":"2025-10-03T14:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.624479 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.624554 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.624583 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.624626 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.624649 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:30Z","lastTransitionTime":"2025-10-03T14:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.727036 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.727067 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.727075 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.727087 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.727115 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:30Z","lastTransitionTime":"2025-10-03T14:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.792735 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:30 crc kubenswrapper[4636]: E1003 14:02:30.793617 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.792815 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:02:30 crc kubenswrapper[4636]: E1003 14:02:30.793755 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.792791 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.792822 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:02:30 crc kubenswrapper[4636]: E1003 14:02:30.793833 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:30 crc kubenswrapper[4636]: E1003 14:02:30.793910 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.809966 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127e9241-93b8-44d4-b99d-6117dc0edfa7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85d8b73ecf3263c5fceaf8dd7d91636220c00bc39ffae7be47f18c9ced792aec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf31bf5f171ae6bb7dfe6ed48d22852c787576286274ed5a3dbd6574dfedb25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f4bfc6d038008c78e0e35c604f9dd0afac335b6127e43889ed335c27e1c2c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f185765f3ef723b58932469681a3cb9249e44073b26c30748a9716a1b7dd51d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372b8bce145d4df128ccb3005114663e054a28c52d9e46000689fcb6ed0e07bf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"ension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849160 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1003 14:01:11.849183 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1003 14:01:11.849192 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1003 14:01:11.849245 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759500056\\\\\\\\\\\\\\\" (2025-10-03 14:00:55 +0000 UTC to 2025-11-02 14:00:56 +0000 UTC (now=2025-10-03 14:01:11.849218966 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849276 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1003 14:01:11.849287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1003 14:01:11.849419 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759500071\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759500071\\\\\\\\\\\\\\\" (2025-10-03 13:01:11 +0000 UTC to 2026-10-03 13:01:11 +0000 UTC (now=2025-10-03 14:01:11.84940306 +0000 UTC))\\\\\\\"\\\\nI1003 14:01:11.849459 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1003 14:01:11.849495 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1003 14:01:11.849528 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2770642474/tls.crt::/tmp/serving-cert-2770642474/tls.key\\\\\\\"\\\\nI1003 14:01:11.849714 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c98d1e8f959e428d9a766ccee9b73a8aadc4565d2630153fb2e80c863a4fc1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4d7eed72086795a72b8b87fde9cbf207c28485c9137d969369de7655a5bc69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.821965 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01de5e2b-fdd5-441c-b16a-2a8c23d8520a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae09654ffcd8f59fc1ea875ec17a86d9e421644363bb7b761ee5c32d52760fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4c344be5f6e3ff9611f5000d87254fda72718ca79eda40aa735ba5f1bd95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cf3f403152468f7c3ab024e00ff69e54557a194ac3708c01d993c5f46ff58a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e22134c583718c15c3835037d28f505ebc3bf8961e9ecd3fb6c3e06ce03e8f89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.829596 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.829627 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.829636 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.829650 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.829658 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:30Z","lastTransitionTime":"2025-10-03T14:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.833816 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.846087 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30b5eec196a240b5b86972fbf16d4096c33d21297300085edcd84de46114def5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.856315 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xf7xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686acf3e-9445-4e3f-9a49-d714556a8e52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1af103e431490af276b73ccadce501759f87e63ca4d23e5f48c214a26ef3f834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xf7xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.866210 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r9xm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61e13aef-fd75-4e3e-a84d-44093600f786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e8da7fa32845e4c02143aba22286fa8eea8223e9069da182937b5f148da827a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2ksq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r9xm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.877172 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f6d0d7eed71096ad853df3d00ffd85251264d3d1c709fe1af5aeed945f6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://756c96d959ba3e62cf797cb9f63cf9b8ebee8948077cf21f471399ca8f13a6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.894158 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lbt25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2470111-1b59-4048-89ff-2b7e83659200\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b30fc0f6edd84ef5b469799bfa9719f0e6d2c8a18cbf2d25c0596da1a46cc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1abef7cc22f11e0db538f92dae8fa47b17398057d4951353f6447f87ed042a25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3e3adbf44b39d48698df09055e1ba386e0a4c7f3f5407f95ae3680d94cdc464\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ac32e184affa0eb297f656c89df5e50fc3ca562b28eb33f3164f2c90daf66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://950034d0d9d43317388fb347ae9bdb8f2cb70d96eaf64fb06c63328cedb3f019\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0acb834c49c05e729ef03de6af33d915a63ea129339b6fe2fd1ef14204845483\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://091f39df87022abf000d492a81c3bc0c14a97e959f88729a9d727c8ae31d0bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svdtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lbt25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.904924 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j5vpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5045a93a-725e-48c0-b553-2c10569de997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb259a3c7fa12a75c4729979def8392191091bb90da1bce6bb7112e0a5995392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b3819c73c2b8e54026a6a13ab75376992530ca159f858bdafa1183a699beb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d8b85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j5vpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:30Z is after 2025-08-24T17:21:41Z" Oct 03 
14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.905221 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs\") pod \"network-metrics-daemon-vm9z7\" (UID: \"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\") " pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:30 crc kubenswrapper[4636]: E1003 14:02:30.905345 4636 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:02:30 crc kubenswrapper[4636]: E1003 14:02:30.905389 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs podName:a7f8fb91-fbef-43b5-b771-f376cfbb1cdd nodeName:}" failed. No retries permitted until 2025-10-03 14:03:34.905376102 +0000 UTC m=+164.764102349 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs") pod "network-metrics-daemon-vm9z7" (UID: "a7f8fb91-fbef-43b5-b771-f376cfbb1cdd") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.916217 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13bee4e3-7f64-4357-bed9-745c0d0dd6ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff424a5b92bb8438a2e76122ffa5e05d457192648bf1f55f2297e76a7cb47d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41eb1cae663ea79ad7f010d97d32eed262031f3adf1456ab1809fcb31692876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd7
91fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c40181cc78922c75c7afc6b5ed746041476c8d5c363cd05d818d898485cb08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac62042dbcf663afd17de357d0084236a70cc9acded2bd9a7fb1b8dfc14b32f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.927132 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.931518 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.931547 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.931557 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.931572 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.931582 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:30Z","lastTransitionTime":"2025-10-03T14:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.943615 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564529e3-ff40-4923-9f6d-319a9b41720a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:02:11Z\\\",\\\"message\\\":\\\"obj_retry.go:551] Creating *factory.egressNode crc took: 28.926067ms\\\\nI1003 14:02:11.918695 6573 factory.go:1336] Added *v1.Node event handler 7\\\\nI1003 14:02:11.918762 6573 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1003 14:02:11.919169 6573 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1003 14:02:11.919255 6573 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 14:02:11.919293 6573 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 14:02:11.919384 6573 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 14:02:11.919400 6573 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 14:02:11.919553 6573 factory.go:656] Stopping watch factory\\\\nI1003 14:02:11.919586 6573 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 14:02:11.919280 6573 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1003 14:02:11.919658 6573 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 14:02:11.919672 6573 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 14:02:11.919692 6573 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 14:02:11.924322 6573 ovnkube.go:599] Stopped ovnkube\\\\nI1003 14:02:11.924380 6573 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 14:02:11.924470 6573 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:02:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t7xd5_openshift-ovn-kubernetes(564529e3-ff40-4923-9f6d-319a9b41720a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:01:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p9qj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7xd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.952651 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fb48b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm9z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.961531 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ed9d52d-9394-4f9f-b1e4-ee7b4481a530\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://449f08b53e055181f2144302dcff762922e28aaa605b9256e9c0e0d4b2027413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5de2406a1c7eb859a4433c77e351aeefe545517d1fe3bf914419b6db29a6a44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5de2406a1c7eb859a4433c77e351aeefe545517d1fe3bf914419b6db29a6a44c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:30 crc kubenswrapper[4636]: I1003 14:02:30.979413 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8be0beac-7f0e-492d-a776-3aed571292ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c57804a5fe21cefac57519b8f918bf23674bdfd79a85b0ce4656c06d1fe147e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0926f422e599406a516b61935ce9563022c3e9e7ad33a9ca950f1c058ced436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e6ba1eff8ce1262a298ccec613cc114e0b4d56cd27980d398de9f85418daa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4f38d1718147c948d0c47a36c036bef42c6351
0f7f88a4e5a64016afcd4c0ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e0ff2beb74f7fbf7fa9d399206c8f11329cbcf607af0cc6dbeb6ff1a80b473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99646a04d7ebb8fe9b23012d70c0a8a05cbcd8c7cbc71cc4e276063575920152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99646a04d7ebb8fe9b23012d70c0a8a05cbcd8c7cbc71cc4e276063575920152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad4cdce98deaf19c3e62a0bad33bf255718bfc9f3c0872d9d90332a83c86bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bad4cdce98deaf19c3e62a0bad33bf255718bfc9f3c0872d9d90332a83c86bba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6d7a189fa3e1baec50c682556ca6f00c21773804bd51f2e34a599f39c801fd8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6d7a189fa3e1baec50c682556ca6f00c21773804bd51f2e34a599f39c801fd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T14:00:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T14:00:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:00:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.001355 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38833e3e2a4fa75025926a3c3500154eb370fb9805d8f877e0feea64d8d0de5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:30Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.022743 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.037008 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.037042 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.037054 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.037077 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.037109 4636 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:31Z","lastTransitionTime":"2025-10-03T14:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.046651 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"140a698f-2661-4dc8-86d9-929b0d6dd326\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:02:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f09d19aad0b3dd34eb48df35bc872b186fab30f7d6dc9fae25b3fa3b5b2c1d85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T14:02:03Z\\\",\\\"message\\\":\\\"2025-10-03T14:01:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6e5679ce-597f-479f-bebe-cabc4f1dd8f1\\\\n2025-10-03T14:01:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6e5679ce-597f-479f-bebe-cabc4f1dd8f1 to /host/opt/cni/bin/\\\\n2025-10-03T14:01:18Z [verbose] multus-daemon started\\\\n2025-10-03T14:01:18Z [verbose] Readiness Indicator file check\\\\n2025-10-03T14:02:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:02:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5fmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.057844 4636 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f078d6dd-d81e-4a06-aca1-508bf23a2170\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T14:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b195d650e70b0a8d5bab7bab5f5d49a189e19004be54afc073a060c60a49c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T14:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmm8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T14:01:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ngmch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T14:02:31Z is after 2025-08-24T17:21:41Z" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.139741 4636 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.139779 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.139788 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.139803 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.139813 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:31Z","lastTransitionTime":"2025-10-03T14:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.241368 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.241407 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.241419 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.241434 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.241444 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:31Z","lastTransitionTime":"2025-10-03T14:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.343972 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.344008 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.344019 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.344035 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.344046 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:31Z","lastTransitionTime":"2025-10-03T14:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.446287 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.446337 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.446348 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.446365 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.446379 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:31Z","lastTransitionTime":"2025-10-03T14:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.548755 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.548800 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.548812 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.548830 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.548841 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:31Z","lastTransitionTime":"2025-10-03T14:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.651296 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.651328 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.651338 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.651353 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.651363 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:31Z","lastTransitionTime":"2025-10-03T14:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.754177 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.754224 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.754257 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.754273 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.754283 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:31Z","lastTransitionTime":"2025-10-03T14:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.857068 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.857148 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.857159 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.857175 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.857185 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:31Z","lastTransitionTime":"2025-10-03T14:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.959813 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.959875 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.959891 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.959915 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:31 crc kubenswrapper[4636]: I1003 14:02:31.959933 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:31Z","lastTransitionTime":"2025-10-03T14:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.062517 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.062562 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.062575 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.062595 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.062611 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:32Z","lastTransitionTime":"2025-10-03T14:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.165189 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.165221 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.165230 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.165245 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.165254 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:32Z","lastTransitionTime":"2025-10-03T14:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.267594 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.267654 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.267667 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.267688 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.267703 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:32Z","lastTransitionTime":"2025-10-03T14:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.369782 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.369830 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.369841 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.369857 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.369867 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:32Z","lastTransitionTime":"2025-10-03T14:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.472328 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.472378 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.472393 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.472410 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.472419 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:32Z","lastTransitionTime":"2025-10-03T14:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.574772 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.574826 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.574839 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.574858 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.574870 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:32Z","lastTransitionTime":"2025-10-03T14:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.677341 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.677418 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.677430 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.677451 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.677467 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:32Z","lastTransitionTime":"2025-10-03T14:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.780509 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.780550 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.780559 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.780574 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.780584 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:32Z","lastTransitionTime":"2025-10-03T14:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.793328 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.793357 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.793525 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.793666 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 14:02:32 crc kubenswrapper[4636]: E1003 14:02:32.793749 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:32 crc kubenswrapper[4636]: E1003 14:02:32.793941 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:02:32 crc kubenswrapper[4636]: E1003 14:02:32.794027 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:02:32 crc kubenswrapper[4636]: E1003 14:02:32.794071 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.883437 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.883490 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.883500 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.883515 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.883526 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:32Z","lastTransitionTime":"2025-10-03T14:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.985859 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.985931 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.985941 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.985954 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:32 crc kubenswrapper[4636]: I1003 14:02:32.985963 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:32Z","lastTransitionTime":"2025-10-03T14:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.093475 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.093545 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.093560 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.093609 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.093622 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:33Z","lastTransitionTime":"2025-10-03T14:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.196942 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.196995 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.197008 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.197028 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.197043 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:33Z","lastTransitionTime":"2025-10-03T14:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.298776 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.298814 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.298825 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.298841 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.298851 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:33Z","lastTransitionTime":"2025-10-03T14:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.401539 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.401580 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.401592 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.401608 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.401620 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:33Z","lastTransitionTime":"2025-10-03T14:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.504881 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.504927 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.504936 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.504952 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.504963 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:33Z","lastTransitionTime":"2025-10-03T14:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.608310 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.608381 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.608400 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.608428 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.608450 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:33Z","lastTransitionTime":"2025-10-03T14:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.712127 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.712195 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.712216 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.712243 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.712264 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:33Z","lastTransitionTime":"2025-10-03T14:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.815903 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.815982 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.816001 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.816035 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.816054 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:33Z","lastTransitionTime":"2025-10-03T14:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.919030 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.919077 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.919087 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.919119 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:33 crc kubenswrapper[4636]: I1003 14:02:33.919132 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:33Z","lastTransitionTime":"2025-10-03T14:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.021360 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.021750 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.021856 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.021957 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.022037 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:34Z","lastTransitionTime":"2025-10-03T14:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.124875 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.124966 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.124983 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.125031 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.125049 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:34Z","lastTransitionTime":"2025-10-03T14:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.227219 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.227287 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.227298 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.227315 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.227326 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:34Z","lastTransitionTime":"2025-10-03T14:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.329644 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.329683 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.329701 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.329724 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.329737 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:34Z","lastTransitionTime":"2025-10-03T14:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.431991 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.432051 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.432082 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.432125 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.432138 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:34Z","lastTransitionTime":"2025-10-03T14:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.534174 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.534216 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.534228 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.534243 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.534252 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:34Z","lastTransitionTime":"2025-10-03T14:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.643418 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.643456 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.643469 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.643484 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.643495 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:34Z","lastTransitionTime":"2025-10-03T14:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.747261 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.747335 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.747352 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.747382 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.747405 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:34Z","lastTransitionTime":"2025-10-03T14:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.793746 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 14:02:34 crc kubenswrapper[4636]: E1003 14:02:34.793950 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.794027 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7"
Oct 03 14:02:34 crc kubenswrapper[4636]: E1003 14:02:34.794092 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.794154 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 03 14:02:34 crc kubenswrapper[4636]: E1003 14:02:34.794213 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.795220 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 03 14:02:34 crc kubenswrapper[4636]: E1003 14:02:34.795396 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.849568 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.849624 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.849639 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.849661 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.849768 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:34Z","lastTransitionTime":"2025-10-03T14:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.951770 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.951814 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.951825 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.951840 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:34 crc kubenswrapper[4636]: I1003 14:02:34.951849 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:34Z","lastTransitionTime":"2025-10-03T14:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.054186 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.054228 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.054240 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.054254 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.054265 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:35Z","lastTransitionTime":"2025-10-03T14:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.155972 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.156006 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.156016 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.156032 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.156044 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:35Z","lastTransitionTime":"2025-10-03T14:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.257857 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.257887 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.257897 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.257910 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.257919 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:35Z","lastTransitionTime":"2025-10-03T14:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.360362 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.360400 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.360410 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.360427 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.360438 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:35Z","lastTransitionTime":"2025-10-03T14:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.462381 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.462417 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.462427 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.462443 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.462453 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:35Z","lastTransitionTime":"2025-10-03T14:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.564705 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.564777 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.564788 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.564823 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.564834 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:35Z","lastTransitionTime":"2025-10-03T14:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.667770 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.667821 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.667833 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.667850 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.667863 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:35Z","lastTransitionTime":"2025-10-03T14:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.770931 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.770983 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.770993 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.771011 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.771024 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:35Z","lastTransitionTime":"2025-10-03T14:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.874189 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.874229 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.874238 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.874255 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.874265 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:35Z","lastTransitionTime":"2025-10-03T14:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.985680 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.985742 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.985754 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.985776 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:35 crc kubenswrapper[4636]: I1003 14:02:35.985789 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:35Z","lastTransitionTime":"2025-10-03T14:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.088469 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.088569 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.088580 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.088601 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.088613 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:36Z","lastTransitionTime":"2025-10-03T14:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.191226 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.191272 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.191287 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.191305 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.191317 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:36Z","lastTransitionTime":"2025-10-03T14:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.293655 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.293707 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.293724 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.293743 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.293754 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:36Z","lastTransitionTime":"2025-10-03T14:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.395507 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.395566 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.395580 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.395597 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.395609 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:36Z","lastTransitionTime":"2025-10-03T14:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.497983 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.498029 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.498038 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.498055 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.498069 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:36Z","lastTransitionTime":"2025-10-03T14:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.600638 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.600674 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.600683 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.600697 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.600706 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:36Z","lastTransitionTime":"2025-10-03T14:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.703518 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.703575 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.703592 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.703615 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.703627 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:36Z","lastTransitionTime":"2025-10-03T14:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.793056 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.793073 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.793283 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.793293 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:02:36 crc kubenswrapper[4636]: E1003 14:02:36.793388 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:02:36 crc kubenswrapper[4636]: E1003 14:02:36.793470 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:02:36 crc kubenswrapper[4636]: E1003 14:02:36.793513 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:36 crc kubenswrapper[4636]: E1003 14:02:36.793605 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.806415 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.806464 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.806476 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.806494 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.806505 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:36Z","lastTransitionTime":"2025-10-03T14:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.908660 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.908723 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.908743 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.908772 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:36 crc kubenswrapper[4636]: I1003 14:02:36.908796 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:36Z","lastTransitionTime":"2025-10-03T14:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.010838 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.010902 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.010915 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.010936 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.010946 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:37Z","lastTransitionTime":"2025-10-03T14:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.113042 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.113083 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.113091 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.113120 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.113130 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:37Z","lastTransitionTime":"2025-10-03T14:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.215810 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.215846 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.215857 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.215874 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.215886 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:37Z","lastTransitionTime":"2025-10-03T14:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.317776 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.317847 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.317862 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.317887 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.317931 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:37Z","lastTransitionTime":"2025-10-03T14:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.420296 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.420330 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.420339 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.420353 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.420361 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:37Z","lastTransitionTime":"2025-10-03T14:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.523036 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.523139 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.523153 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.523168 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.523177 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:37Z","lastTransitionTime":"2025-10-03T14:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.625144 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.625197 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.625208 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.625229 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.625240 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:37Z","lastTransitionTime":"2025-10-03T14:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.727023 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.727149 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.727160 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.727177 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.727191 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:37Z","lastTransitionTime":"2025-10-03T14:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.829949 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.829994 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.830004 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.830019 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.830028 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:37Z","lastTransitionTime":"2025-10-03T14:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.932913 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.932956 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.932967 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.932985 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:37 crc kubenswrapper[4636]: I1003 14:02:37.932997 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:37Z","lastTransitionTime":"2025-10-03T14:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.036076 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.036170 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.036182 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.036199 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.036213 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:38Z","lastTransitionTime":"2025-10-03T14:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.138955 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.139011 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.139025 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.139055 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.139072 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:38Z","lastTransitionTime":"2025-10-03T14:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.242275 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.242379 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.242398 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.242421 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.242441 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:38Z","lastTransitionTime":"2025-10-03T14:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.347334 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.347386 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.347401 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.347421 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.347431 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:38Z","lastTransitionTime":"2025-10-03T14:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.450290 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.450348 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.450365 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.450384 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.450400 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:38Z","lastTransitionTime":"2025-10-03T14:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.553625 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.553680 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.553702 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.553724 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.553740 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:38Z","lastTransitionTime":"2025-10-03T14:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.656154 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.656198 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.656209 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.656224 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.656232 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:38Z","lastTransitionTime":"2025-10-03T14:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.758019 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.758061 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.758073 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.758089 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.758121 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:38Z","lastTransitionTime":"2025-10-03T14:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.792994 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:02:38 crc kubenswrapper[4636]: E1003 14:02:38.793172 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.793330 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.793412 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:38 crc kubenswrapper[4636]: E1003 14:02:38.793472 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.793575 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:02:38 crc kubenswrapper[4636]: E1003 14:02:38.793646 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:02:38 crc kubenswrapper[4636]: E1003 14:02:38.793720 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.860586 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.860634 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.860650 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.860672 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.860690 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:38Z","lastTransitionTime":"2025-10-03T14:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.962557 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.962594 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.962605 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.962623 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:38 crc kubenswrapper[4636]: I1003 14:02:38.962637 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:38Z","lastTransitionTime":"2025-10-03T14:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.065136 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.065183 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.065197 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.065215 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.065228 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:39Z","lastTransitionTime":"2025-10-03T14:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.167821 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.167863 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.167872 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.167887 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.167896 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:39Z","lastTransitionTime":"2025-10-03T14:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.270774 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.270836 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.270846 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.270882 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.270894 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:39Z","lastTransitionTime":"2025-10-03T14:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.373367 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.373429 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.373438 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.373451 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.373460 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:39Z","lastTransitionTime":"2025-10-03T14:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.475668 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.475698 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.475714 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.475730 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.475742 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:39Z","lastTransitionTime":"2025-10-03T14:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.513945 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.513998 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.514010 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.514030 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.514042 4636 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T14:02:39Z","lastTransitionTime":"2025-10-03T14:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.566828 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-zzjlq"] Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.567210 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zzjlq" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.570864 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.570925 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.570943 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.571208 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.618346 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=86.618323229 podStartE2EDuration="1m26.618323229s" podCreationTimestamp="2025-10-03 14:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:02:39.601566652 +0000 UTC m=+109.460292899" watchObservedRunningTime="2025-10-03 14:02:39.618323229 +0000 UTC m=+109.477049476" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.635890 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=54.635847446 podStartE2EDuration="54.635847446s" podCreationTimestamp="2025-10-03 14:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:02:39.618757221 +0000 UTC m=+109.477483468" watchObservedRunningTime="2025-10-03 14:02:39.635847446 +0000 UTC m=+109.494573693" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.663546 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xf7xs" podStartSLOduration=87.663527902 podStartE2EDuration="1m27.663527902s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:02:39.662993709 +0000 UTC m=+109.521719966" watchObservedRunningTime="2025-10-03 14:02:39.663527902 +0000 UTC m=+109.522254149" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.691600 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-r9xm2" podStartSLOduration=87.691569557 podStartE2EDuration="1m27.691569557s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:02:39.676750359 +0000 UTC m=+109.535476596" watchObservedRunningTime="2025-10-03 14:02:39.691569557 +0000 UTC m=+109.550309315" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.698469 
Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.698469 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5edf38b9-14dc-418d-9cd0-e10965b0bc9f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-zzjlq\" (UID: \"5edf38b9-14dc-418d-9cd0-e10965b0bc9f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zzjlq"
Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.698515 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5edf38b9-14dc-418d-9cd0-e10965b0bc9f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-zzjlq\" (UID: \"5edf38b9-14dc-418d-9cd0-e10965b0bc9f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zzjlq"
Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.698543 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5edf38b9-14dc-418d-9cd0-e10965b0bc9f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-zzjlq\" (UID: \"5edf38b9-14dc-418d-9cd0-e10965b0bc9f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zzjlq"
Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.698593 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5edf38b9-14dc-418d-9cd0-e10965b0bc9f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-zzjlq\" (UID: \"5edf38b9-14dc-418d-9cd0-e10965b0bc9f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zzjlq"
Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.698627 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5edf38b9-14dc-418d-9cd0-e10965b0bc9f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-zzjlq\" (UID: \"5edf38b9-14dc-418d-9cd0-e10965b0bc9f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zzjlq"
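
The reconciler_common.go entries show the cluster-version-operator pod's five volumes entering the kubelet volume manager's desired state; each is verified as attached and then mounted (the MountVolume entries that follow). A toy model of that desired-vs-actual reconcile pattern (names mirror the CVO pod's volumes; the types and flow are invented for illustration, not the kubelet's):

```go
package main

import "fmt"

func main() {
	// Desired state of world: volumes the pod spec requires.
	desired := []string{"service-ca", "kube-api-access", "etc-ssl-certs", "etc-cvo-updatepayloads", "serving-cert"}
	// Actual state of world: nothing verified or mounted yet.
	actual := map[string]bool{}

	for _, vol := range desired {
		if actual[vol] {
			continue // already reconciled
		}
		fmt.Printf("VerifyControllerAttachedVolume started for volume %q\n", vol)
		fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", vol)
		actual[vol] = true
	}
}
```

Host-path volumes need no real attach step, which is why their SetUp lines land microseconds after the mounts start in the log.
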
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=86.755291122 podStartE2EDuration="1m26.755291122s" podCreationTimestamp="2025-10-03 14:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:02:39.739252113 +0000 UTC m=+109.597978360" watchObservedRunningTime="2025-10-03 14:02:39.755291122 +0000 UTC m=+109.614017369" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.794169 4636 scope.go:117] "RemoveContainer" containerID="1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef" Oct 03 14:02:39 crc kubenswrapper[4636]: E1003 14:02:39.794335 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-t7xd5_openshift-ovn-kubernetes(564529e3-ff40-4923-9f6d-319a9b41720a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.799723 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5edf38b9-14dc-418d-9cd0-e10965b0bc9f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-zzjlq\" (UID: \"5edf38b9-14dc-418d-9cd0-e10965b0bc9f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zzjlq" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.799762 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5edf38b9-14dc-418d-9cd0-e10965b0bc9f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-zzjlq\" (UID: \"5edf38b9-14dc-418d-9cd0-e10965b0bc9f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zzjlq" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.799792 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5edf38b9-14dc-418d-9cd0-e10965b0bc9f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-zzjlq\" (UID: \"5edf38b9-14dc-418d-9cd0-e10965b0bc9f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zzjlq" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.799824 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5edf38b9-14dc-418d-9cd0-e10965b0bc9f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-zzjlq\" (UID: \"5edf38b9-14dc-418d-9cd0-e10965b0bc9f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zzjlq" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.799901 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5edf38b9-14dc-418d-9cd0-e10965b0bc9f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-zzjlq\" (UID: \"5edf38b9-14dc-418d-9cd0-e10965b0bc9f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zzjlq" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.799906 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5edf38b9-14dc-418d-9cd0-e10965b0bc9f-etc-ssl-certs\") pod 
\"cluster-version-operator-5c965bbfc6-zzjlq\" (UID: \"5edf38b9-14dc-418d-9cd0-e10965b0bc9f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zzjlq" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.799965 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5edf38b9-14dc-418d-9cd0-e10965b0bc9f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-zzjlq\" (UID: \"5edf38b9-14dc-418d-9cd0-e10965b0bc9f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zzjlq" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.801207 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5edf38b9-14dc-418d-9cd0-e10965b0bc9f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-zzjlq\" (UID: \"5edf38b9-14dc-418d-9cd0-e10965b0bc9f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zzjlq" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.811713 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5edf38b9-14dc-418d-9cd0-e10965b0bc9f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-zzjlq\" (UID: \"5edf38b9-14dc-418d-9cd0-e10965b0bc9f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zzjlq" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.812072 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=23.812055639 podStartE2EDuration="23.812055639s" podCreationTimestamp="2025-10-03 14:02:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:02:39.811412873 +0000 UTC m=+109.670139120" watchObservedRunningTime="2025-10-03 14:02:39.812055639 +0000 UTC m=+109.670781886" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.825721 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5edf38b9-14dc-418d-9cd0-e10965b0bc9f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-zzjlq\" (UID: \"5edf38b9-14dc-418d-9cd0-e10965b0bc9f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zzjlq" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.836539 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=18.836517853 podStartE2EDuration="18.836517853s" podCreationTimestamp="2025-10-03 14:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:02:39.836247426 +0000 UTC m=+109.694973673" watchObservedRunningTime="2025-10-03 14:02:39.836517853 +0000 UTC m=+109.695244100" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.883997 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zzjlq" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.885326 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ltsq6" podStartSLOduration=87.885309107 podStartE2EDuration="1m27.885309107s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:02:39.884762533 +0000 UTC m=+109.743488780" watchObservedRunningTime="2025-10-03 14:02:39.885309107 +0000 UTC m=+109.744035354" Oct 03 14:02:39 crc kubenswrapper[4636]: I1003 14:02:39.904684 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podStartSLOduration=87.904668131 podStartE2EDuration="1m27.904668131s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:02:39.903830329 +0000 UTC m=+109.762556576" watchObservedRunningTime="2025-10-03 14:02:39.904668131 +0000 UTC m=+109.763394378" Oct 03 14:02:40 crc kubenswrapper[4636]: I1003 14:02:40.317986 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zzjlq" event={"ID":"5edf38b9-14dc-418d-9cd0-e10965b0bc9f","Type":"ContainerStarted","Data":"d47d8fb20a495bf6d830144d3b050ab003e9f67fd17293a224b6b859f45ae3b7"} Oct 03 14:02:40 crc kubenswrapper[4636]: I1003 14:02:40.318047 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zzjlq" event={"ID":"5edf38b9-14dc-418d-9cd0-e10965b0bc9f","Type":"ContainerStarted","Data":"4d1000b14a68cebbf2e7e75471b8afda584e7d49a76879def556978fe2fec63b"} Oct 03 14:02:40 crc kubenswrapper[4636]: I1003 14:02:40.793157 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:02:40 crc kubenswrapper[4636]: I1003 14:02:40.793170 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:02:40 crc kubenswrapper[4636]: I1003 14:02:40.793216 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:02:40 crc kubenswrapper[4636]: I1003 14:02:40.793262 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:40 crc kubenswrapper[4636]: E1003 14:02:40.794053 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:02:40 crc kubenswrapper[4636]: E1003 14:02:40.794290 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:40 crc kubenswrapper[4636]: E1003 14:02:40.794433 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:02:40 crc kubenswrapper[4636]: E1003 14:02:40.794593 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:02:42 crc kubenswrapper[4636]: I1003 14:02:42.793747 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:02:42 crc kubenswrapper[4636]: I1003 14:02:42.793806 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:02:42 crc kubenswrapper[4636]: E1003 14:02:42.793893 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:42 crc kubenswrapper[4636]: I1003 14:02:42.793747 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:02:42 crc kubenswrapper[4636]: I1003 14:02:42.793940 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:42 crc kubenswrapper[4636]: E1003 14:02:42.794042 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:02:42 crc kubenswrapper[4636]: E1003 14:02:42.794177 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:02:42 crc kubenswrapper[4636]: E1003 14:02:42.794254 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:02:44 crc kubenswrapper[4636]: I1003 14:02:44.793605 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:02:44 crc kubenswrapper[4636]: I1003 14:02:44.793657 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:02:44 crc kubenswrapper[4636]: I1003 14:02:44.793626 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:02:44 crc kubenswrapper[4636]: E1003 14:02:44.793739 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:02:44 crc kubenswrapper[4636]: I1003 14:02:44.793611 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:44 crc kubenswrapper[4636]: E1003 14:02:44.793907 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:44 crc kubenswrapper[4636]: E1003 14:02:44.794021 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:02:44 crc kubenswrapper[4636]: E1003 14:02:44.794062 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:02:46 crc kubenswrapper[4636]: I1003 14:02:46.793233 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:02:46 crc kubenswrapper[4636]: I1003 14:02:46.793272 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:02:46 crc kubenswrapper[4636]: I1003 14:02:46.793691 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:46 crc kubenswrapper[4636]: E1003 14:02:46.793738 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:02:46 crc kubenswrapper[4636]: E1003 14:02:46.793872 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:02:46 crc kubenswrapper[4636]: I1003 14:02:46.793902 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:02:46 crc kubenswrapper[4636]: E1003 14:02:46.793950 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:46 crc kubenswrapper[4636]: E1003 14:02:46.794030 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:02:48 crc kubenswrapper[4636]: I1003 14:02:48.793012 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:48 crc kubenswrapper[4636]: I1003 14:02:48.793129 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:02:48 crc kubenswrapper[4636]: I1003 14:02:48.793277 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:02:48 crc kubenswrapper[4636]: I1003 14:02:48.793339 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:02:48 crc kubenswrapper[4636]: E1003 14:02:48.793337 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:02:48 crc kubenswrapper[4636]: E1003 14:02:48.793181 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:02:48 crc kubenswrapper[4636]: E1003 14:02:48.793520 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:48 crc kubenswrapper[4636]: E1003 14:02:48.793416 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:02:50 crc kubenswrapper[4636]: I1003 14:02:50.348451 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ltsq6_140a698f-2661-4dc8-86d9-929b0d6dd326/kube-multus/1.log" Oct 03 14:02:50 crc kubenswrapper[4636]: I1003 14:02:50.348928 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ltsq6_140a698f-2661-4dc8-86d9-929b0d6dd326/kube-multus/0.log" Oct 03 14:02:50 crc kubenswrapper[4636]: I1003 14:02:50.348966 4636 generic.go:334] "Generic (PLEG): container finished" podID="140a698f-2661-4dc8-86d9-929b0d6dd326" containerID="f09d19aad0b3dd34eb48df35bc872b186fab30f7d6dc9fae25b3fa3b5b2c1d85" exitCode=1 Oct 03 14:02:50 crc kubenswrapper[4636]: I1003 14:02:50.348996 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ltsq6" event={"ID":"140a698f-2661-4dc8-86d9-929b0d6dd326","Type":"ContainerDied","Data":"f09d19aad0b3dd34eb48df35bc872b186fab30f7d6dc9fae25b3fa3b5b2c1d85"} Oct 03 14:02:50 crc kubenswrapper[4636]: I1003 14:02:50.349030 4636 scope.go:117] "RemoveContainer" containerID="45ff9eb95f17fc35cd069f815224c489fde9f7a822a85440679d155045d46b57" Oct 03 14:02:50 crc kubenswrapper[4636]: I1003 14:02:50.350256 4636 scope.go:117] "RemoveContainer" containerID="f09d19aad0b3dd34eb48df35bc872b186fab30f7d6dc9fae25b3fa3b5b2c1d85" Oct 03 14:02:50 crc kubenswrapper[4636]: E1003 14:02:50.350442 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-ltsq6_openshift-multus(140a698f-2661-4dc8-86d9-929b0d6dd326)\"" pod="openshift-multus/multus-ltsq6" podUID="140a698f-2661-4dc8-86d9-929b0d6dd326" Oct 03 14:02:50 crc kubenswrapper[4636]: I1003 14:02:50.373369 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zzjlq" podStartSLOduration=98.373357295 podStartE2EDuration="1m38.373357295s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:02:40.341618494 +0000 UTC m=+110.200344741" watchObservedRunningTime="2025-10-03 14:02:50.373357295 +0000 UTC m=+120.232083542" Oct 03 14:02:50 crc kubenswrapper[4636]: E1003 14:02:50.764011 4636 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 03 14:02:50 crc kubenswrapper[4636]: I1003 14:02:50.793263 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:02:50 crc kubenswrapper[4636]: E1003 14:02:50.794146 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:02:50 crc kubenswrapper[4636]: I1003 14:02:50.794219 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:50 crc kubenswrapper[4636]: I1003 14:02:50.794266 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:02:50 crc kubenswrapper[4636]: I1003 14:02:50.794263 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:02:50 crc kubenswrapper[4636]: E1003 14:02:50.794376 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:02:50 crc kubenswrapper[4636]: E1003 14:02:50.794491 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:02:50 crc kubenswrapper[4636]: E1003 14:02:50.794575 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:50 crc kubenswrapper[4636]: E1003 14:02:50.902983 4636 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 14:02:51 crc kubenswrapper[4636]: I1003 14:02:51.354876 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ltsq6_140a698f-2661-4dc8-86d9-929b0d6dd326/kube-multus/1.log" Oct 03 14:02:52 crc kubenswrapper[4636]: I1003 14:02:52.793798 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:52 crc kubenswrapper[4636]: I1003 14:02:52.793805 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:02:52 crc kubenswrapper[4636]: E1003 14:02:52.794781 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:02:52 crc kubenswrapper[4636]: I1003 14:02:52.793880 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:02:52 crc kubenswrapper[4636]: E1003 14:02:52.794871 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:02:52 crc kubenswrapper[4636]: I1003 14:02:52.793798 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:02:52 crc kubenswrapper[4636]: E1003 14:02:52.794930 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:52 crc kubenswrapper[4636]: E1003 14:02:52.794688 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:02:53 crc kubenswrapper[4636]: I1003 14:02:53.794204 4636 scope.go:117] "RemoveContainer" containerID="1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef" Oct 03 14:02:54 crc kubenswrapper[4636]: I1003 14:02:54.368884 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7xd5_564529e3-ff40-4923-9f6d-319a9b41720a/ovnkube-controller/3.log" Oct 03 14:02:54 crc kubenswrapper[4636]: I1003 14:02:54.371983 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerStarted","Data":"76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1"} Oct 03 14:02:54 crc kubenswrapper[4636]: I1003 14:02:54.372887 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:02:54 crc kubenswrapper[4636]: I1003 14:02:54.794007 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:54 crc kubenswrapper[4636]: E1003 14:02:54.796203 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:02:54 crc kubenswrapper[4636]: I1003 14:02:54.796032 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:02:54 crc kubenswrapper[4636]: E1003 14:02:54.797318 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:02:54 crc kubenswrapper[4636]: I1003 14:02:54.796013 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:02:54 crc kubenswrapper[4636]: E1003 14:02:54.797728 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:54 crc kubenswrapper[4636]: I1003 14:02:54.796051 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:02:54 crc kubenswrapper[4636]: E1003 14:02:54.798038 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:02:54 crc kubenswrapper[4636]: I1003 14:02:54.972411 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" podStartSLOduration=102.972393869 podStartE2EDuration="1m42.972393869s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:02:54.411773892 +0000 UTC m=+124.270500139" watchObservedRunningTime="2025-10-03 14:02:54.972393869 +0000 UTC m=+124.831120116" Oct 03 14:02:54 crc kubenswrapper[4636]: I1003 14:02:54.973463 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vm9z7"] Oct 03 14:02:55 crc kubenswrapper[4636]: I1003 14:02:55.374377 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:55 crc kubenswrapper[4636]: E1003 14:02:55.374688 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:02:55 crc kubenswrapper[4636]: E1003 14:02:55.904385 4636 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 14:02:56 crc kubenswrapper[4636]: I1003 14:02:56.793323 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:56 crc kubenswrapper[4636]: I1003 14:02:56.793332 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:02:56 crc kubenswrapper[4636]: I1003 14:02:56.793440 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:02:56 crc kubenswrapper[4636]: E1003 14:02:56.793512 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:02:56 crc kubenswrapper[4636]: E1003 14:02:56.793603 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:02:56 crc kubenswrapper[4636]: E1003 14:02:56.793656 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:56 crc kubenswrapper[4636]: I1003 14:02:56.793352 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:02:56 crc kubenswrapper[4636]: E1003 14:02:56.794429 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:02:58 crc kubenswrapper[4636]: I1003 14:02:58.793774 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:02:58 crc kubenswrapper[4636]: I1003 14:02:58.793803 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:02:58 crc kubenswrapper[4636]: I1003 14:02:58.793790 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:02:58 crc kubenswrapper[4636]: E1003 14:02:58.793908 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:02:58 crc kubenswrapper[4636]: I1003 14:02:58.793782 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:02:58 crc kubenswrapper[4636]: E1003 14:02:58.794013 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:02:58 crc kubenswrapper[4636]: E1003 14:02:58.794119 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:02:58 crc kubenswrapper[4636]: E1003 14:02:58.794186 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:03:00 crc kubenswrapper[4636]: I1003 14:03:00.793267 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:03:00 crc kubenswrapper[4636]: I1003 14:03:00.793295 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:03:00 crc kubenswrapper[4636]: I1003 14:03:00.793267 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:03:00 crc kubenswrapper[4636]: I1003 14:03:00.794237 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:03:00 crc kubenswrapper[4636]: E1003 14:03:00.794233 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:03:00 crc kubenswrapper[4636]: E1003 14:03:00.794363 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:03:00 crc kubenswrapper[4636]: E1003 14:03:00.794623 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:03:00 crc kubenswrapper[4636]: E1003 14:03:00.794767 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:03:00 crc kubenswrapper[4636]: E1003 14:03:00.905504 4636 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 14:03:02 crc kubenswrapper[4636]: I1003 14:03:02.793756 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:03:02 crc kubenswrapper[4636]: E1003 14:03:02.794009 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:03:02 crc kubenswrapper[4636]: I1003 14:03:02.794142 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:03:02 crc kubenswrapper[4636]: I1003 14:03:02.794144 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:03:02 crc kubenswrapper[4636]: I1003 14:03:02.794239 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:03:02 crc kubenswrapper[4636]: E1003 14:03:02.794268 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:03:02 crc kubenswrapper[4636]: E1003 14:03:02.794549 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:03:02 crc kubenswrapper[4636]: E1003 14:03:02.794877 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:03:02 crc kubenswrapper[4636]: I1003 14:03:02.795478 4636 scope.go:117] "RemoveContainer" containerID="f09d19aad0b3dd34eb48df35bc872b186fab30f7d6dc9fae25b3fa3b5b2c1d85" Oct 03 14:03:03 crc kubenswrapper[4636]: I1003 14:03:03.406421 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ltsq6_140a698f-2661-4dc8-86d9-929b0d6dd326/kube-multus/1.log" Oct 03 14:03:03 crc kubenswrapper[4636]: I1003 14:03:03.406702 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ltsq6" event={"ID":"140a698f-2661-4dc8-86d9-929b0d6dd326","Type":"ContainerStarted","Data":"1b6aa2e19ac2f9f087fab0b525d8c3d4b09b610b1fa0aa8608d6083dcd243173"} Oct 03 14:03:04 crc kubenswrapper[4636]: I1003 14:03:04.792894 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:03:04 crc kubenswrapper[4636]: I1003 14:03:04.792939 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:03:04 crc kubenswrapper[4636]: I1003 14:03:04.792953 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:03:04 crc kubenswrapper[4636]: I1003 14:03:04.792940 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:03:04 crc kubenswrapper[4636]: E1003 14:03:04.793036 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 14:03:04 crc kubenswrapper[4636]: E1003 14:03:04.793093 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 14:03:04 crc kubenswrapper[4636]: E1003 14:03:04.793232 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm9z7" podUID="a7f8fb91-fbef-43b5-b771-f376cfbb1cdd" Oct 03 14:03:04 crc kubenswrapper[4636]: E1003 14:03:04.793390 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 14:03:06 crc kubenswrapper[4636]: I1003 14:03:06.793276 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:03:06 crc kubenswrapper[4636]: I1003 14:03:06.793356 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:03:06 crc kubenswrapper[4636]: I1003 14:03:06.793319 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:03:06 crc kubenswrapper[4636]: I1003 14:03:06.793287 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7" Oct 03 14:03:06 crc kubenswrapper[4636]: I1003 14:03:06.795945 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 03 14:03:06 crc kubenswrapper[4636]: I1003 14:03:06.796603 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 03 14:03:06 crc kubenswrapper[4636]: I1003 14:03:06.796719 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 03 14:03:06 crc kubenswrapper[4636]: I1003 14:03:06.797348 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 03 14:03:06 crc kubenswrapper[4636]: I1003 14:03:06.797385 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 03 14:03:06 crc kubenswrapper[4636]: I1003 14:03:06.797429 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.430144 4636 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.459471 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qzkgg"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.460449 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qzkgg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.462341 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.463161 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.463873 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-j8z62"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.464406 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-j8z62" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.465652 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2p8qq"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.480293 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.485912 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-5q7j6"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.495941 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.496486 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.496899 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.496912 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-5q7j6" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.497601 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.497908 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.498055 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.498182 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.498281 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.498380 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.498479 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.498581 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.498745 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.498870 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.498997 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.499136 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.501111 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-877d7"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.501260 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.501504 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-877d7" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.502493 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-lqxss"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.503054 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.504636 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-qfbfg"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.505278 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qfbfg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.507594 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.509769 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6qz8k"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.510411 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vzs54"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.510697 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-srw4g"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.511027 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.511236 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-vzs54" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.511668 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qz8k" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.512749 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fzp9w"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.513062 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-j8z62"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.513087 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rsgnn"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.514423 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qzkgg"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.514521 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rsgnn" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.514847 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-fzp9w" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.531888 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.531999 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r49hv"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.532423 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r49hv" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.533072 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.535236 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.535440 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.535553 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.535660 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.535709 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.535807 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.535848 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.535882 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.535954 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.536000 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.536086 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.536121 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.536197 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.536230 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.536266 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.536350 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.536420 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.536429 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 03 14:03:10 
crc kubenswrapper[4636]: I1003 14:03:10.536488 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.536551 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.536634 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.536703 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.536776 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.536845 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.536902 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.537212 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.537381 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.537391 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.537480 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.537507 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.537562 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.537615 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.537634 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.537659 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.537723 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.537732 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.537793 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.535960 4636 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.537825 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.537870 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.537805 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.537959 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.537969 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.539552 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-fcxkp"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.540036 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-fcxkp" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.540987 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7fqgj"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.543716 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7fqgj" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.548876 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.549873 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.550757 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.551066 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.551196 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.551305 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.551567 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.552150 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.552336 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.552442 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.552503 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mz2wd"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.552603 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.552697 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.553004 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.553134 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.553216 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.553249 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.553283 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.553396 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.567315 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.572291 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.572842 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.573918 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.575455 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.575459 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.576746 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.577329 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.577593 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.577643 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.577764 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.577929 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.590939 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.591961 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.591960 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.592762 4636 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.593548 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.596223 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pb9vh"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.597244 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-zvvrp"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.597721 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-zvvrp" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.597773 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.598235 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pb9vh" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.610527 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.610527 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.612199 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.613587 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.613815 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.614207 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fqn9d"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.614687 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.614842 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fqn9d" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.615491 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.620550 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.620928 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.621015 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.623159 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tpww2"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.626687 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.627135 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.627410 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tpww2" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.628068 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.631181 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-44xt7"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.631763 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9gjr4"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.632172 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9gjr4" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.632579 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-44xt7" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.632595 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c9dcg"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.633512 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c9dcg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.635067 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pk6zb"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.635586 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tw6g9"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.652158 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.655535 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tw6g9" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.635594 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.657887 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55eb43d6-a42c-4b21-a8e9-82d2c75ee839-config\") pod \"console-operator-58897d9998-fzp9w\" (UID: \"55eb43d6-a42c-4b21-a8e9-82d2c75ee839\") " pod="openshift-console-operator/console-operator-58897d9998-fzp9w" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661409 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6eefb2-0ce4-4501-b443-d06c45efd41f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-fqn9d\" (UID: \"9d6eefb2-0ce4-4501-b443-d06c45efd41f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fqn9d" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661441 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-trusted-ca-bundle\") pod \"console-f9d7485db-lqxss\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661465 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7cf0aef3-c9be-4539-91bb-0a26d7d2a82e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6qz8k\" (UID: \"7cf0aef3-c9be-4539-91bb-0a26d7d2a82e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qz8k" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661489 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13703c39-6eda-487f-9d53-509c6042d515-config\") pod \"machine-approver-56656f9798-qfbfg\" (UID: \"13703c39-6eda-487f-9d53-509c6042d515\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qfbfg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661509 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-serving-cert\") pod \"controller-manager-879f6c89f-mz2wd\" (UID: \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661529 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfwwv\" (UniqueName: \"kubernetes.io/projected/ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76-kube-api-access-bfwwv\") pod \"router-default-5444994796-zvvrp\" (UID: \"ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76\") " pod="openshift-ingress/router-default-5444994796-zvvrp" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661551 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-477ck\" (UniqueName: \"kubernetes.io/projected/3a0e6228-8a56-4b62-87f1-24eec9cffdd5-kube-api-access-477ck\") pod \"multus-admission-controller-857f4d67dd-fcxkp\" (UID: \"3a0e6228-8a56-4b62-87f1-24eec9cffdd5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fcxkp" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661570 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6977d44-d8ff-4d40-959f-024da50c53fe-console-oauth-config\") pod \"console-f9d7485db-lqxss\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661589 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftpjn\" (UniqueName: \"kubernetes.io/projected/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-kube-api-access-ftpjn\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661610 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661629 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-node-pullsecrets\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661645 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-audit-dir\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661664 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8cbc4a55-551d-4314-bcb0-751f82313dc0-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-j8z62\" (UID: \"8cbc4a55-551d-4314-bcb0-751f82313dc0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-j8z62" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661684 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d6eefb2-0ce4-4501-b443-d06c45efd41f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-fqn9d\" (UID: \"9d6eefb2-0ce4-4501-b443-d06c45efd41f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fqn9d" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661703 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-audit-policies\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661723 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661741 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-console-config\") pod \"console-f9d7485db-lqxss\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661760 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0013e14d-2163-45f2-8a98-dbe6805e40d0-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rsgnn\" (UID: \"0013e14d-2163-45f2-8a98-dbe6805e40d0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rsgnn" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661779 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb5t7\" (UniqueName: \"kubernetes.io/projected/0013e14d-2163-45f2-8a98-dbe6805e40d0-kube-api-access-pb5t7\") pod \"openshift-apiserver-operator-796bbdcf4f-rsgnn\" (UID: \"0013e14d-2163-45f2-8a98-dbe6805e40d0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rsgnn" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661818 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pj24\" (UniqueName: \"kubernetes.io/projected/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-kube-api-access-8pj24\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661837 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55eb43d6-a42c-4b21-a8e9-82d2c75ee839-trusted-ca\") pod \"console-operator-58897d9998-fzp9w\" (UID: \"55eb43d6-a42c-4b21-a8e9-82d2c75ee839\") " pod="openshift-console-operator/console-operator-58897d9998-fzp9w" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661855 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e97026a-c5d6-4767-8ff5-54fad15f7a49-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tpww2\" (UID: \"6e97026a-c5d6-4767-8ff5-54fad15f7a49\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tpww2" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661877 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mnzr\" (UniqueName: \"kubernetes.io/projected/308225d5-c374-4bb6-a967-020bf6e7173f-kube-api-access-6mnzr\") pod \"authentication-operator-69f744f599-vzs54\" (UID: \"308225d5-c374-4bb6-a967-020bf6e7173f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vzs54" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661898 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661918 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-audit-dir\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661937 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/010ef4ac-9542-4a76-a005-385439b1045c-audit-dir\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661955 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-config\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661973 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f83d7f1b-5df1-40e0-ac89-3cfca1fd0b23-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9gjr4\" (UID: \"f83d7f1b-5df1-40e0-ac89-3cfca1fd0b23\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9gjr4" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.661993 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/308225d5-c374-4bb6-a967-020bf6e7173f-service-ca-bundle\") pod \"authentication-operator-69f744f599-vzs54\" (UID: \"308225d5-c374-4bb6-a967-020bf6e7173f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vzs54" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.662010 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-etcd-client\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.662027 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.658953 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-gnzm2"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.662250 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psnzs\" (UniqueName: \"kubernetes.io/projected/0499c819-4b67-4882-9354-f7b9d6d2adc7-kube-api-access-psnzs\") pod \"control-plane-machine-set-operator-78cbb6b69f-r49hv\" (UID: \"0499c819-4b67-4882-9354-f7b9d6d2adc7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r49hv" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.662356 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq9l9\" (UniqueName: \"kubernetes.io/projected/55eb43d6-a42c-4b21-a8e9-82d2c75ee839-kube-api-access-dq9l9\") pod \"console-operator-58897d9998-fzp9w\" (UID: \"55eb43d6-a42c-4b21-a8e9-82d2c75ee839\") " pod="openshift-console-operator/console-operator-58897d9998-fzp9w" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.662573 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76-service-ca-bundle\") pod \"router-default-5444994796-zvvrp\" (UID: \"ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76\") " pod="openshift-ingress/router-default-5444994796-zvvrp" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.662600 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76-stats-auth\") pod \"router-default-5444994796-zvvrp\" (UID: \"ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76\") " pod="openshift-ingress/router-default-5444994796-zvvrp" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.662620 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmp2b\" (UniqueName: \"kubernetes.io/projected/e697897f-0594-48da-967d-e429421b8fec-kube-api-access-gmp2b\") pod \"machine-api-operator-5694c8668f-qzkgg\" (UID: \"e697897f-0594-48da-967d-e429421b8fec\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qzkgg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.662644 4636 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-encryption-config\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.662661 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbbb5\" (UniqueName: \"kubernetes.io/projected/0aa8bab3-482e-41e8-800e-9962a4146194-kube-api-access-fbbb5\") pod \"dns-operator-744455d44c-c9dcg\" (UID: \"0aa8bab3-482e-41e8-800e-9962a4146194\") " pod="openshift-dns-operator/dns-operator-744455d44c-c9dcg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.662711 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a0e6228-8a56-4b62-87f1-24eec9cffdd5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fcxkp\" (UID: \"3a0e6228-8a56-4b62-87f1-24eec9cffdd5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fcxkp" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.662732 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccdcn\" (UniqueName: \"kubernetes.io/projected/7cf0aef3-c9be-4539-91bb-0a26d7d2a82e-kube-api-access-ccdcn\") pod \"openshift-config-operator-7777fb866f-6qz8k\" (UID: \"7cf0aef3-c9be-4539-91bb-0a26d7d2a82e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qz8k" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.662753 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bjvx\" (UniqueName: \"kubernetes.io/projected/8cbc4a55-551d-4314-bcb0-751f82313dc0-kube-api-access-2bjvx\") pod \"cluster-image-registry-operator-dc59b4c8b-j8z62\" (UID: \"8cbc4a55-551d-4314-bcb0-751f82313dc0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-j8z62" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.662775 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5fee0926-3042-4015-ad02-90f4306431ae-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pb9vh\" (UID: \"5fee0926-3042-4015-ad02-90f4306431ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pb9vh" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.662795 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfzzt\" (UniqueName: \"kubernetes.io/projected/8c513f61-cee7-451f-b8a9-1dab425641a8-kube-api-access-mfzzt\") pod \"route-controller-manager-6576b87f9c-pc4j4\" (UID: \"8c513f61-cee7-451f-b8a9-1dab425641a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.662821 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-serving-cert\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc 
kubenswrapper[4636]: I1003 14:03:10.662841 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8d6ca41b-f3b4-4bce-ad85-0150cfeb2362-srv-cert\") pod \"catalog-operator-68c6474976-44xt7\" (UID: \"8d6ca41b-f3b4-4bce-ad85-0150cfeb2362\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-44xt7" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.662968 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87t6p\" (UniqueName: \"kubernetes.io/projected/40e1f28c-6d64-4fa1-b554-507ff389f115-kube-api-access-87t6p\") pod \"downloads-7954f5f757-5q7j6\" (UID: \"40e1f28c-6d64-4fa1-b554-507ff389f115\") " pod="openshift-console/downloads-7954f5f757-5q7j6" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.662991 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.663011 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8cbc4a55-551d-4314-bcb0-751f82313dc0-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-j8z62\" (UID: \"8cbc4a55-551d-4314-bcb0-751f82313dc0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-j8z62" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.663204 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thvtp\" (UniqueName: \"kubernetes.io/projected/9d6eefb2-0ce4-4501-b443-d06c45efd41f-kube-api-access-thvtp\") pod \"openshift-controller-manager-operator-756b6f6bc6-fqn9d\" (UID: \"9d6eefb2-0ce4-4501-b443-d06c45efd41f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fqn9d" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.663325 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0013e14d-2163-45f2-8a98-dbe6805e40d0-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rsgnn\" (UID: \"0013e14d-2163-45f2-8a98-dbe6805e40d0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rsgnn" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.671575 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.671907 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.672681 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ace060b-1d1d-4704-9655-78ab3599db2b-config\") pod 
\"kube-controller-manager-operator-78b949d7b-7fqgj\" (UID: \"1ace060b-1d1d-4704-9655-78ab3599db2b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7fqgj" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.672720 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-etcd-client\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.672758 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0aa8bab3-482e-41e8-800e-9962a4146194-metrics-tls\") pod \"dns-operator-744455d44c-c9dcg\" (UID: \"0aa8bab3-482e-41e8-800e-9962a4146194\") " pod="openshift-dns-operator/dns-operator-744455d44c-c9dcg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.672780 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5fee0926-3042-4015-ad02-90f4306431ae-images\") pod \"machine-config-operator-74547568cd-pb9vh\" (UID: \"5fee0926-3042-4015-ad02-90f4306431ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pb9vh" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.672802 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76-metrics-certs\") pod \"router-default-5444994796-zvvrp\" (UID: \"ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76\") " pod="openshift-ingress/router-default-5444994796-zvvrp" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.672829 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.672832 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.672852 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-audit\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.673140 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hndq5"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.673148 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55eb43d6-a42c-4b21-a8e9-82d2c75ee839-serving-cert\") pod \"console-operator-58897d9998-fzp9w\" (UID: \"55eb43d6-a42c-4b21-a8e9-82d2c75ee839\") " pod="openshift-console-operator/console-operator-58897d9998-fzp9w" Oct 03 
14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.673186 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/64802881-57b2-4263-b5d8-f3c4c224c692-config-volume\") pod \"collect-profiles-29325000-fk4kf\" (UID: \"64802881-57b2-4263-b5d8-f3c4c224c692\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.673220 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f02b12d5-48d5-48ed-81c7-db4e06189afe-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-tw6g9\" (UID: \"f02b12d5-48d5-48ed-81c7-db4e06189afe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tw6g9" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.673277 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8cbc4a55-551d-4314-bcb0-751f82313dc0-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-j8z62\" (UID: \"8cbc4a55-551d-4314-bcb0-751f82313dc0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-j8z62" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.673325 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2kjc\" (UniqueName: \"kubernetes.io/projected/010ef4ac-9542-4a76-a005-385439b1045c-kube-api-access-z2kjc\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.673498 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c513f61-cee7-451f-b8a9-1dab425641a8-serving-cert\") pod \"route-controller-manager-6576b87f9c-pc4j4\" (UID: \"8c513f61-cee7-451f-b8a9-1dab425641a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.673819 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cf0aef3-c9be-4539-91bb-0a26d7d2a82e-serving-cert\") pod \"openshift-config-operator-7777fb866f-6qz8k\" (UID: \"7cf0aef3-c9be-4539-91bb-0a26d7d2a82e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qz8k" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.673912 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mz2wd\" (UID: \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.673984 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/308225d5-c374-4bb6-a967-020bf6e7173f-serving-cert\") pod \"authentication-operator-69f744f599-vzs54\" (UID: \"308225d5-c374-4bb6-a967-020bf6e7173f\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-vzs54" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.674051 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.674149 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c513f61-cee7-451f-b8a9-1dab425641a8-client-ca\") pod \"route-controller-manager-6576b87f9c-pc4j4\" (UID: \"8c513f61-cee7-451f-b8a9-1dab425641a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.674218 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtwjd\" (UniqueName: \"kubernetes.io/projected/c83442ff-933c-4f99-aae8-522e4dc94199-kube-api-access-vtwjd\") pod \"cluster-samples-operator-665b6dd947-877d7\" (UID: \"c83442ff-933c-4f99-aae8-522e4dc94199\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-877d7" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.674264 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gjrbj"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.674339 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5fee0926-3042-4015-ad02-90f4306431ae-proxy-tls\") pod \"machine-config-operator-74547568cd-pb9vh\" (UID: \"5fee0926-3042-4015-ad02-90f4306431ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pb9vh" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.674404 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e97026a-c5d6-4767-8ff5-54fad15f7a49-config\") pod \"kube-apiserver-operator-766d6c64bb-tpww2\" (UID: \"6e97026a-c5d6-4767-8ff5-54fad15f7a49\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tpww2" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.674480 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6977d44-d8ff-4d40-959f-024da50c53fe-console-serving-cert\") pod \"console-f9d7485db-lqxss\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.674550 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.674614 4636 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c513f61-cee7-451f-b8a9-1dab425641a8-config\") pod \"route-controller-manager-6576b87f9c-pc4j4\" (UID: \"8c513f61-cee7-451f-b8a9-1dab425641a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.674679 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0499c819-4b67-4882-9354-f7b9d6d2adc7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-r49hv\" (UID: \"0499c819-4b67-4882-9354-f7b9d6d2adc7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r49hv" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.674749 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b5dg\" (UniqueName: \"kubernetes.io/projected/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-kube-api-access-8b5dg\") pod \"controller-manager-879f6c89f-mz2wd\" (UID: \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.674824 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.673749 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.674970 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4dcf\" (UniqueName: \"kubernetes.io/projected/5fee0926-3042-4015-ad02-90f4306431ae-kube-api-access-k4dcf\") pod \"machine-config-operator-74547568cd-pb9vh\" (UID: \"5fee0926-3042-4015-ad02-90f4306431ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pb9vh" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.675055 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ace060b-1d1d-4704-9655-78ab3599db2b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7fqgj\" (UID: \"1ace060b-1d1d-4704-9655-78ab3599db2b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7fqgj" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.675145 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/308225d5-c374-4bb6-a967-020bf6e7173f-config\") pod \"authentication-operator-69f744f599-vzs54\" (UID: \"308225d5-c374-4bb6-a967-020bf6e7173f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vzs54" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.675227 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.675307 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d2h8\" (UniqueName: \"kubernetes.io/projected/13703c39-6eda-487f-9d53-509c6042d515-kube-api-access-9d2h8\") pod \"machine-approver-56656f9798-qfbfg\" (UID: \"13703c39-6eda-487f-9d53-509c6042d515\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qfbfg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.675394 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x5qd\" (UniqueName: \"kubernetes.io/projected/64802881-57b2-4263-b5d8-f3c4c224c692-kube-api-access-6x5qd\") pod \"collect-profiles-29325000-fk4kf\" (UID: \"64802881-57b2-4263-b5d8-f3c4c224c692\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.675503 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/308225d5-c374-4bb6-a967-020bf6e7173f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vzs54\" (UID: \"308225d5-c374-4bb6-a967-020bf6e7173f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vzs54" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.675599 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/e697897f-0594-48da-967d-e429421b8fec-images\") pod \"machine-api-operator-5694c8668f-qzkgg\" (UID: \"e697897f-0594-48da-967d-e429421b8fec\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qzkgg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.675695 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e697897f-0594-48da-967d-e429421b8fec-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qzkgg\" (UID: \"e697897f-0594-48da-967d-e429421b8fec\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qzkgg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.675783 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/64802881-57b2-4263-b5d8-f3c4c224c692-secret-volume\") pod \"collect-profiles-29325000-fk4kf\" (UID: \"64802881-57b2-4263-b5d8-f3c4c224c692\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.675878 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gwrd\" (UniqueName: \"kubernetes.io/projected/8d6ca41b-f3b4-4bce-ad85-0150cfeb2362-kube-api-access-6gwrd\") pod \"catalog-operator-68c6474976-44xt7\" (UID: \"8d6ca41b-f3b4-4bce-ad85-0150cfeb2362\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-44xt7" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.675948 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-serving-cert\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.676026 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-audit-policies\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.676116 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.676203 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.676273 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-client-ca\") pod \"controller-manager-879f6c89f-mz2wd\" (UID: \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.676347 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-encryption-config\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.676418 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-config\") pod \"controller-manager-879f6c89f-mz2wd\" (UID: \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.676508 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f83d7f1b-5df1-40e0-ac89-3cfca1fd0b23-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9gjr4\" (UID: \"f83d7f1b-5df1-40e0-ac89-3cfca1fd0b23\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9gjr4" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.676604 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.676704 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ace060b-1d1d-4704-9655-78ab3599db2b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7fqgj\" (UID: \"1ace060b-1d1d-4704-9655-78ab3599db2b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7fqgj" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.676784 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwmjm\" (UniqueName: \"kubernetes.io/projected/f6977d44-d8ff-4d40-959f-024da50c53fe-kube-api-access-mwmjm\") pod \"console-f9d7485db-lqxss\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.676852 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/13703c39-6eda-487f-9d53-509c6042d515-auth-proxy-config\") pod \"machine-approver-56656f9798-qfbfg\" (UID: \"13703c39-6eda-487f-9d53-509c6042d515\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qfbfg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.676923 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/6e97026a-c5d6-4767-8ff5-54fad15f7a49-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-tpww2\" (UID: \"6e97026a-c5d6-4767-8ff5-54fad15f7a49\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tpww2" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.677000 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e697897f-0594-48da-967d-e429421b8fec-config\") pod \"machine-api-operator-5694c8668f-qzkgg\" (UID: \"e697897f-0594-48da-967d-e429421b8fec\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qzkgg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.677074 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-image-import-ca\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.677174 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8d6ca41b-f3b4-4bce-ad85-0150cfeb2362-profile-collector-cert\") pod \"catalog-operator-68c6474976-44xt7\" (UID: \"8d6ca41b-f3b4-4bce-ad85-0150cfeb2362\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-44xt7" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.677245 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5btf\" (UniqueName: \"kubernetes.io/projected/f02b12d5-48d5-48ed-81c7-db4e06189afe-kube-api-access-z5btf\") pod \"kube-storage-version-migrator-operator-b67b599dd-tw6g9\" (UID: \"f02b12d5-48d5-48ed-81c7-db4e06189afe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tw6g9" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.677315 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f83d7f1b-5df1-40e0-ac89-3cfca1fd0b23-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9gjr4\" (UID: \"f83d7f1b-5df1-40e0-ac89-3cfca1fd0b23\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9gjr4" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.677384 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76-default-certificate\") pod \"router-default-5444994796-zvvrp\" (UID: \"ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76\") " pod="openshift-ingress/router-default-5444994796-zvvrp" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.677578 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/13703c39-6eda-487f-9d53-509c6042d515-machine-approver-tls\") pod \"machine-approver-56656f9798-qfbfg\" (UID: \"13703c39-6eda-487f-9d53-509c6042d515\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qfbfg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.677661 4636 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-service-ca\") pod \"console-f9d7485db-lqxss\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.677734 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-oauth-serving-cert\") pod \"console-f9d7485db-lqxss\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.677815 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-etcd-serving-ca\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.677896 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c83442ff-933c-4f99-aae8-522e4dc94199-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-877d7\" (UID: \"c83442ff-933c-4f99-aae8-522e4dc94199\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-877d7" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.677967 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f02b12d5-48d5-48ed-81c7-db4e06189afe-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-tw6g9\" (UID: \"f02b12d5-48d5-48ed-81c7-db4e06189afe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tw6g9" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.674863 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hndq5" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.674826 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gjrbj" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.678762 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-htgvn"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.679429 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-htgvn" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.682148 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvb2q"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.682651 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvb2q" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.686525 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.688148 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ds4fg"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.690524 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jmjtn"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.691075 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rjp5j"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.691383 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ds4fg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.691769 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6bp9m"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.691912 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jmjtn" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.691922 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.692763 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.693617 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-5q7j6"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.693696 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.693711 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6qz8k"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.693648 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6bp9m" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.696707 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vzs54"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.702006 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-877d7"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.702163 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rsgnn"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.702255 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-lqxss"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.706235 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.706515 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r49hv"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.708647 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.710693 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2p8qq"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.712167 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-fcxkp"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.713366 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pb9vh"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.714411 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-gnzm2"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.715568 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fqn9d"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.718403 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9gjr4"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.718442 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.718452 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fzp9w"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.721881 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.722725 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jmjtn"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.723519 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-44xt7"] Oct 03 14:03:10 crc 
kubenswrapper[4636]: I1003 14:03:10.724831 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tpww2"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.727416 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-htgvn"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.729580 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mz2wd"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.729930 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-bgrwl"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.730782 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bgrwl" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.731890 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pk6zb"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.734423 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c9dcg"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.736150 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hndq5"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.738410 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ds4fg"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.738762 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6bp9m"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.740368 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-srw4g"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.740931 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gjrbj"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.742167 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rjp5j"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.742530 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.743178 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7fqgj"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.743979 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-54vkr"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.746044 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tw6g9"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.746133 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-54vkr" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.746226 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ljmrj"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.753358 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.753425 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bgrwl"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.753571 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.756889 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-54vkr"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.761296 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvb2q"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.762132 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.764327 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ljmrj"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.765583 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-92b88"] Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.766183 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-92b88" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.778715 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0013e14d-2163-45f2-8a98-dbe6805e40d0-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rsgnn\" (UID: \"0013e14d-2163-45f2-8a98-dbe6805e40d0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rsgnn" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.778751 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.778776 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8cbc4a55-551d-4314-bcb0-751f82313dc0-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-j8z62\" (UID: \"8cbc4a55-551d-4314-bcb0-751f82313dc0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-j8z62" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.778799 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thvtp\" (UniqueName: \"kubernetes.io/projected/9d6eefb2-0ce4-4501-b443-d06c45efd41f-kube-api-access-thvtp\") pod \"openshift-controller-manager-operator-756b6f6bc6-fqn9d\" (UID: \"9d6eefb2-0ce4-4501-b443-d06c45efd41f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fqn9d" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.778823 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ace060b-1d1d-4704-9655-78ab3599db2b-config\") pod \"kube-controller-manager-operator-78b949d7b-7fqgj\" (UID: \"1ace060b-1d1d-4704-9655-78ab3599db2b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7fqgj" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.778844 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-etcd-client\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.778867 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0aa8bab3-482e-41e8-800e-9962a4146194-metrics-tls\") pod \"dns-operator-744455d44c-c9dcg\" (UID: \"0aa8bab3-482e-41e8-800e-9962a4146194\") " pod="openshift-dns-operator/dns-operator-744455d44c-c9dcg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.778889 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5fee0926-3042-4015-ad02-90f4306431ae-images\") pod \"machine-config-operator-74547568cd-pb9vh\" (UID: \"5fee0926-3042-4015-ad02-90f4306431ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pb9vh" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.778912 
4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76-metrics-certs\") pod \"router-default-5444994796-zvvrp\" (UID: \"ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76\") " pod="openshift-ingress/router-default-5444994796-zvvrp" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.778934 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-audit\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.778957 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55eb43d6-a42c-4b21-a8e9-82d2c75ee839-serving-cert\") pod \"console-operator-58897d9998-fzp9w\" (UID: \"55eb43d6-a42c-4b21-a8e9-82d2c75ee839\") " pod="openshift-console-operator/console-operator-58897d9998-fzp9w" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.778981 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779005 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/64802881-57b2-4263-b5d8-f3c4c224c692-config-volume\") pod \"collect-profiles-29325000-fk4kf\" (UID: \"64802881-57b2-4263-b5d8-f3c4c224c692\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779029 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f02b12d5-48d5-48ed-81c7-db4e06189afe-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-tw6g9\" (UID: \"f02b12d5-48d5-48ed-81c7-db4e06189afe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tw6g9" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779061 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8cbc4a55-551d-4314-bcb0-751f82313dc0-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-j8z62\" (UID: \"8cbc4a55-551d-4314-bcb0-751f82313dc0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-j8z62" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779086 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mz2wd\" (UID: \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779144 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/308225d5-c374-4bb6-a967-020bf6e7173f-serving-cert\") pod \"authentication-operator-69f744f599-vzs54\" (UID: \"308225d5-c374-4bb6-a967-020bf6e7173f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vzs54" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779169 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779190 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2kjc\" (UniqueName: \"kubernetes.io/projected/010ef4ac-9542-4a76-a005-385439b1045c-kube-api-access-z2kjc\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779212 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c513f61-cee7-451f-b8a9-1dab425641a8-serving-cert\") pod \"route-controller-manager-6576b87f9c-pc4j4\" (UID: \"8c513f61-cee7-451f-b8a9-1dab425641a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779235 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cf0aef3-c9be-4539-91bb-0a26d7d2a82e-serving-cert\") pod \"openshift-config-operator-7777fb866f-6qz8k\" (UID: \"7cf0aef3-c9be-4539-91bb-0a26d7d2a82e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qz8k" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779259 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e97026a-c5d6-4767-8ff5-54fad15f7a49-config\") pod \"kube-apiserver-operator-766d6c64bb-tpww2\" (UID: \"6e97026a-c5d6-4767-8ff5-54fad15f7a49\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tpww2" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779282 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6977d44-d8ff-4d40-959f-024da50c53fe-console-serving-cert\") pod \"console-f9d7485db-lqxss\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779306 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c513f61-cee7-451f-b8a9-1dab425641a8-client-ca\") pod \"route-controller-manager-6576b87f9c-pc4j4\" (UID: \"8c513f61-cee7-451f-b8a9-1dab425641a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779331 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtwjd\" (UniqueName: \"kubernetes.io/projected/c83442ff-933c-4f99-aae8-522e4dc94199-kube-api-access-vtwjd\") pod \"cluster-samples-operator-665b6dd947-877d7\" (UID: 
\"c83442ff-933c-4f99-aae8-522e4dc94199\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-877d7" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779352 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5fee0926-3042-4015-ad02-90f4306431ae-proxy-tls\") pod \"machine-config-operator-74547568cd-pb9vh\" (UID: \"5fee0926-3042-4015-ad02-90f4306431ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pb9vh" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779375 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b5dg\" (UniqueName: \"kubernetes.io/projected/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-kube-api-access-8b5dg\") pod \"controller-manager-879f6c89f-mz2wd\" (UID: \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779399 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779419 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779442 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c513f61-cee7-451f-b8a9-1dab425641a8-config\") pod \"route-controller-manager-6576b87f9c-pc4j4\" (UID: \"8c513f61-cee7-451f-b8a9-1dab425641a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779465 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0499c819-4b67-4882-9354-f7b9d6d2adc7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-r49hv\" (UID: \"0499c819-4b67-4882-9354-f7b9d6d2adc7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r49hv" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779492 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4dcf\" (UniqueName: \"kubernetes.io/projected/5fee0926-3042-4015-ad02-90f4306431ae-kube-api-access-k4dcf\") pod \"machine-config-operator-74547568cd-pb9vh\" (UID: \"5fee0926-3042-4015-ad02-90f4306431ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pb9vh" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779535 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779568 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ace060b-1d1d-4704-9655-78ab3599db2b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7fqgj\" (UID: \"1ace060b-1d1d-4704-9655-78ab3599db2b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7fqgj" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779589 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/308225d5-c374-4bb6-a967-020bf6e7173f-config\") pod \"authentication-operator-69f744f599-vzs54\" (UID: \"308225d5-c374-4bb6-a967-020bf6e7173f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vzs54" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779609 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/308225d5-c374-4bb6-a967-020bf6e7173f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vzs54\" (UID: \"308225d5-c374-4bb6-a967-020bf6e7173f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vzs54" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779629 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e697897f-0594-48da-967d-e429421b8fec-images\") pod \"machine-api-operator-5694c8668f-qzkgg\" (UID: \"e697897f-0594-48da-967d-e429421b8fec\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qzkgg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779652 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e697897f-0594-48da-967d-e429421b8fec-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qzkgg\" (UID: \"e697897f-0594-48da-967d-e429421b8fec\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qzkgg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779674 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d2h8\" (UniqueName: \"kubernetes.io/projected/13703c39-6eda-487f-9d53-509c6042d515-kube-api-access-9d2h8\") pod \"machine-approver-56656f9798-qfbfg\" (UID: \"13703c39-6eda-487f-9d53-509c6042d515\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qfbfg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779696 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x5qd\" (UniqueName: \"kubernetes.io/projected/64802881-57b2-4263-b5d8-f3c4c224c692-kube-api-access-6x5qd\") pod \"collect-profiles-29325000-fk4kf\" (UID: \"64802881-57b2-4263-b5d8-f3c4c224c692\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779721 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gwrd\" (UniqueName: \"kubernetes.io/projected/8d6ca41b-f3b4-4bce-ad85-0150cfeb2362-kube-api-access-6gwrd\") pod \"catalog-operator-68c6474976-44xt7\" (UID: \"8d6ca41b-f3b4-4bce-ad85-0150cfeb2362\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-44xt7" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779731 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ace060b-1d1d-4704-9655-78ab3599db2b-config\") pod \"kube-controller-manager-operator-78b949d7b-7fqgj\" (UID: \"1ace060b-1d1d-4704-9655-78ab3599db2b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7fqgj" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779743 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/64802881-57b2-4263-b5d8-f3c4c224c692-secret-volume\") pod \"collect-profiles-29325000-fk4kf\" (UID: \"64802881-57b2-4263-b5d8-f3c4c224c692\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779767 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-serving-cert\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779766 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779801 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-audit-policies\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779824 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779847 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779869 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-client-ca\") pod \"controller-manager-879f6c89f-mz2wd\" (UID: \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779891 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" 
(UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779912 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-encryption-config\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779933 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-config\") pod \"controller-manager-879f6c89f-mz2wd\" (UID: \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779954 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f83d7f1b-5df1-40e0-ac89-3cfca1fd0b23-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9gjr4\" (UID: \"f83d7f1b-5df1-40e0-ac89-3cfca1fd0b23\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9gjr4" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779976 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e97026a-c5d6-4767-8ff5-54fad15f7a49-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-tpww2\" (UID: \"6e97026a-c5d6-4767-8ff5-54fad15f7a49\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tpww2" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.779999 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ace060b-1d1d-4704-9655-78ab3599db2b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7fqgj\" (UID: \"1ace060b-1d1d-4704-9655-78ab3599db2b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7fqgj" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780023 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwmjm\" (UniqueName: \"kubernetes.io/projected/f6977d44-d8ff-4d40-959f-024da50c53fe-kube-api-access-mwmjm\") pod \"console-f9d7485db-lqxss\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780048 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/13703c39-6eda-487f-9d53-509c6042d515-auth-proxy-config\") pod \"machine-approver-56656f9798-qfbfg\" (UID: \"13703c39-6eda-487f-9d53-509c6042d515\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qfbfg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780073 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e697897f-0594-48da-967d-e429421b8fec-config\") pod \"machine-api-operator-5694c8668f-qzkgg\" (UID: 
\"e697897f-0594-48da-967d-e429421b8fec\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qzkgg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780113 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-image-import-ca\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780141 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8d6ca41b-f3b4-4bce-ad85-0150cfeb2362-profile-collector-cert\") pod \"catalog-operator-68c6474976-44xt7\" (UID: \"8d6ca41b-f3b4-4bce-ad85-0150cfeb2362\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-44xt7" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780163 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/13703c39-6eda-487f-9d53-509c6042d515-machine-approver-tls\") pod \"machine-approver-56656f9798-qfbfg\" (UID: \"13703c39-6eda-487f-9d53-509c6042d515\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qfbfg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780188 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5btf\" (UniqueName: \"kubernetes.io/projected/f02b12d5-48d5-48ed-81c7-db4e06189afe-kube-api-access-z5btf\") pod \"kube-storage-version-migrator-operator-b67b599dd-tw6g9\" (UID: \"f02b12d5-48d5-48ed-81c7-db4e06189afe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tw6g9" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780212 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f83d7f1b-5df1-40e0-ac89-3cfca1fd0b23-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9gjr4\" (UID: \"f83d7f1b-5df1-40e0-ac89-3cfca1fd0b23\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9gjr4" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780233 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76-default-certificate\") pod \"router-default-5444994796-zvvrp\" (UID: \"ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76\") " pod="openshift-ingress/router-default-5444994796-zvvrp" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780254 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c83442ff-933c-4f99-aae8-522e4dc94199-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-877d7\" (UID: \"c83442ff-933c-4f99-aae8-522e4dc94199\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-877d7" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780276 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f02b12d5-48d5-48ed-81c7-db4e06189afe-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-tw6g9\" (UID: \"f02b12d5-48d5-48ed-81c7-db4e06189afe\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tw6g9" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780298 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-service-ca\") pod \"console-f9d7485db-lqxss\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780320 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-oauth-serving-cert\") pod \"console-f9d7485db-lqxss\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780341 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-etcd-serving-ca\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780364 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55eb43d6-a42c-4b21-a8e9-82d2c75ee839-config\") pod \"console-operator-58897d9998-fzp9w\" (UID: \"55eb43d6-a42c-4b21-a8e9-82d2c75ee839\") " pod="openshift-console-operator/console-operator-58897d9998-fzp9w" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780373 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8cbc4a55-551d-4314-bcb0-751f82313dc0-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-j8z62\" (UID: \"8cbc4a55-551d-4314-bcb0-751f82313dc0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-j8z62" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780386 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6eefb2-0ce4-4501-b443-d06c45efd41f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-fqn9d\" (UID: \"9d6eefb2-0ce4-4501-b443-d06c45efd41f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fqn9d" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780418 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-trusted-ca-bundle\") pod \"console-f9d7485db-lqxss\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780458 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfwwv\" (UniqueName: \"kubernetes.io/projected/ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76-kube-api-access-bfwwv\") pod \"router-default-5444994796-zvvrp\" (UID: \"ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76\") " pod="openshift-ingress/router-default-5444994796-zvvrp" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780479 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-477ck\" (UniqueName: 
\"kubernetes.io/projected/3a0e6228-8a56-4b62-87f1-24eec9cffdd5-kube-api-access-477ck\") pod \"multus-admission-controller-857f4d67dd-fcxkp\" (UID: \"3a0e6228-8a56-4b62-87f1-24eec9cffdd5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fcxkp" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780497 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6977d44-d8ff-4d40-959f-024da50c53fe-console-oauth-config\") pod \"console-f9d7485db-lqxss\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780534 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftpjn\" (UniqueName: \"kubernetes.io/projected/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-kube-api-access-ftpjn\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780550 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7cf0aef3-c9be-4539-91bb-0a26d7d2a82e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6qz8k\" (UID: \"7cf0aef3-c9be-4539-91bb-0a26d7d2a82e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qz8k" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780566 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13703c39-6eda-487f-9d53-509c6042d515-config\") pod \"machine-approver-56656f9798-qfbfg\" (UID: \"13703c39-6eda-487f-9d53-509c6042d515\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qfbfg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780600 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-serving-cert\") pod \"controller-manager-879f6c89f-mz2wd\" (UID: \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780617 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8cbc4a55-551d-4314-bcb0-751f82313dc0-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-j8z62\" (UID: \"8cbc4a55-551d-4314-bcb0-751f82313dc0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-j8z62" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780638 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d6eefb2-0ce4-4501-b443-d06c45efd41f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-fqn9d\" (UID: \"9d6eefb2-0ce4-4501-b443-d06c45efd41f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fqn9d" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780686 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-audit-policies\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: 
\"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780707 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780723 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-node-pullsecrets\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780739 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-audit-dir\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780776 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0013e14d-2163-45f2-8a98-dbe6805e40d0-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rsgnn\" (UID: \"0013e14d-2163-45f2-8a98-dbe6805e40d0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rsgnn" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780792 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb5t7\" (UniqueName: \"kubernetes.io/projected/0013e14d-2163-45f2-8a98-dbe6805e40d0-kube-api-access-pb5t7\") pod \"openshift-apiserver-operator-796bbdcf4f-rsgnn\" (UID: \"0013e14d-2163-45f2-8a98-dbe6805e40d0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rsgnn" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780811 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780847 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-console-config\") pod \"console-f9d7485db-lqxss\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780864 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mnzr\" (UniqueName: \"kubernetes.io/projected/308225d5-c374-4bb6-a967-020bf6e7173f-kube-api-access-6mnzr\") pod \"authentication-operator-69f744f599-vzs54\" (UID: \"308225d5-c374-4bb6-a967-020bf6e7173f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vzs54" Oct 03 14:03:10 crc 
kubenswrapper[4636]: I1003 14:03:10.780880 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780895 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pj24\" (UniqueName: \"kubernetes.io/projected/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-kube-api-access-8pj24\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780932 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55eb43d6-a42c-4b21-a8e9-82d2c75ee839-trusted-ca\") pod \"console-operator-58897d9998-fzp9w\" (UID: \"55eb43d6-a42c-4b21-a8e9-82d2c75ee839\") " pod="openshift-console-operator/console-operator-58897d9998-fzp9w" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780949 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e97026a-c5d6-4767-8ff5-54fad15f7a49-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tpww2\" (UID: \"6e97026a-c5d6-4767-8ff5-54fad15f7a49\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tpww2" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780965 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-audit-dir\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.781000 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/308225d5-c374-4bb6-a967-020bf6e7173f-service-ca-bundle\") pod \"authentication-operator-69f744f599-vzs54\" (UID: \"308225d5-c374-4bb6-a967-020bf6e7173f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vzs54" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.781018 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/010ef4ac-9542-4a76-a005-385439b1045c-audit-dir\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.781033 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-config\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.781050 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f83d7f1b-5df1-40e0-ac89-3cfca1fd0b23-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-9gjr4\" (UID: \"f83d7f1b-5df1-40e0-ac89-3cfca1fd0b23\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9gjr4" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.781085 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.781113 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psnzs\" (UniqueName: \"kubernetes.io/projected/0499c819-4b67-4882-9354-f7b9d6d2adc7-kube-api-access-psnzs\") pod \"control-plane-machine-set-operator-78cbb6b69f-r49hv\" (UID: \"0499c819-4b67-4882-9354-f7b9d6d2adc7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r49hv" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.781131 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-etcd-client\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.781146 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq9l9\" (UniqueName: \"kubernetes.io/projected/55eb43d6-a42c-4b21-a8e9-82d2c75ee839-kube-api-access-dq9l9\") pod \"console-operator-58897d9998-fzp9w\" (UID: \"55eb43d6-a42c-4b21-a8e9-82d2c75ee839\") " pod="openshift-console-operator/console-operator-58897d9998-fzp9w" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.781162 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76-service-ca-bundle\") pod \"router-default-5444994796-zvvrp\" (UID: \"ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76\") " pod="openshift-ingress/router-default-5444994796-zvvrp" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.781179 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76-stats-auth\") pod \"router-default-5444994796-zvvrp\" (UID: \"ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76\") " pod="openshift-ingress/router-default-5444994796-zvvrp" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.781196 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a0e6228-8a56-4b62-87f1-24eec9cffdd5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fcxkp\" (UID: \"3a0e6228-8a56-4b62-87f1-24eec9cffdd5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fcxkp" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.781213 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmp2b\" (UniqueName: \"kubernetes.io/projected/e697897f-0594-48da-967d-e429421b8fec-kube-api-access-gmp2b\") pod \"machine-api-operator-5694c8668f-qzkgg\" (UID: \"e697897f-0594-48da-967d-e429421b8fec\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qzkgg" Oct 03 14:03:10 crc 
kubenswrapper[4636]: I1003 14:03:10.781229 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-encryption-config\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.781559 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbbb5\" (UniqueName: \"kubernetes.io/projected/0aa8bab3-482e-41e8-800e-9962a4146194-kube-api-access-fbbb5\") pod \"dns-operator-744455d44c-c9dcg\" (UID: \"0aa8bab3-482e-41e8-800e-9962a4146194\") " pod="openshift-dns-operator/dns-operator-744455d44c-c9dcg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.781584 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccdcn\" (UniqueName: \"kubernetes.io/projected/7cf0aef3-c9be-4539-91bb-0a26d7d2a82e-kube-api-access-ccdcn\") pod \"openshift-config-operator-7777fb866f-6qz8k\" (UID: \"7cf0aef3-c9be-4539-91bb-0a26d7d2a82e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qz8k" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.781602 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bjvx\" (UniqueName: \"kubernetes.io/projected/8cbc4a55-551d-4314-bcb0-751f82313dc0-kube-api-access-2bjvx\") pod \"cluster-image-registry-operator-dc59b4c8b-j8z62\" (UID: \"8cbc4a55-551d-4314-bcb0-751f82313dc0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-j8z62" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.781617 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5fee0926-3042-4015-ad02-90f4306431ae-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pb9vh\" (UID: \"5fee0926-3042-4015-ad02-90f4306431ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pb9vh" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.781633 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87t6p\" (UniqueName: \"kubernetes.io/projected/40e1f28c-6d64-4fa1-b554-507ff389f115-kube-api-access-87t6p\") pod \"downloads-7954f5f757-5q7j6\" (UID: \"40e1f28c-6d64-4fa1-b554-507ff389f115\") " pod="openshift-console/downloads-7954f5f757-5q7j6" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.781652 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.781669 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfzzt\" (UniqueName: \"kubernetes.io/projected/8c513f61-cee7-451f-b8a9-1dab425641a8-kube-api-access-mfzzt\") pod \"route-controller-manager-6576b87f9c-pc4j4\" (UID: \"8c513f61-cee7-451f-b8a9-1dab425641a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.781685 4636 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-serving-cert\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.781700 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8d6ca41b-f3b4-4bce-ad85-0150cfeb2362-srv-cert\") pod \"catalog-operator-68c6474976-44xt7\" (UID: \"8d6ca41b-f3b4-4bce-ad85-0150cfeb2362\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-44xt7" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.781907 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mz2wd\" (UID: \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.782117 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5fee0926-3042-4015-ad02-90f4306431ae-images\") pod \"machine-config-operator-74547568cd-pb9vh\" (UID: \"5fee0926-3042-4015-ad02-90f4306431ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pb9vh" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.782389 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-client-ca\") pod \"controller-manager-879f6c89f-mz2wd\" (UID: \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.782595 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-trusted-ca-bundle\") pod \"console-f9d7485db-lqxss\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.782754 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.783444 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0013e14d-2163-45f2-8a98-dbe6805e40d0-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rsgnn\" (UID: \"0013e14d-2163-45f2-8a98-dbe6805e40d0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rsgnn" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.783760 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-audit\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 
14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.784003 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-config\") pod \"controller-manager-879f6c89f-mz2wd\" (UID: \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.780509 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c513f61-cee7-451f-b8a9-1dab425641a8-client-ca\") pod \"route-controller-manager-6576b87f9c-pc4j4\" (UID: \"8c513f61-cee7-451f-b8a9-1dab425641a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.785060 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5fee0926-3042-4015-ad02-90f4306431ae-proxy-tls\") pod \"machine-config-operator-74547568cd-pb9vh\" (UID: \"5fee0926-3042-4015-ad02-90f4306431ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pb9vh" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.785063 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7cf0aef3-c9be-4539-91bb-0a26d7d2a82e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6qz8k\" (UID: \"7cf0aef3-c9be-4539-91bb-0a26d7d2a82e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qz8k" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.785683 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-etcd-client\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.785687 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13703c39-6eda-487f-9d53-509c6042d515-config\") pod \"machine-approver-56656f9798-qfbfg\" (UID: \"13703c39-6eda-487f-9d53-509c6042d515\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qfbfg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.785963 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/13703c39-6eda-487f-9d53-509c6042d515-auth-proxy-config\") pod \"machine-approver-56656f9798-qfbfg\" (UID: \"13703c39-6eda-487f-9d53-509c6042d515\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qfbfg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.786229 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8cbc4a55-551d-4314-bcb0-751f82313dc0-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-j8z62\" (UID: \"8cbc4a55-551d-4314-bcb0-751f82313dc0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-j8z62" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.786695 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e697897f-0594-48da-967d-e429421b8fec-config\") pod \"machine-api-operator-5694c8668f-qzkgg\" (UID: \"e697897f-0594-48da-967d-e429421b8fec\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qzkgg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.787464 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-image-import-ca\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.788613 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c513f61-cee7-451f-b8a9-1dab425641a8-config\") pod \"route-controller-manager-6576b87f9c-pc4j4\" (UID: \"8c513f61-cee7-451f-b8a9-1dab425641a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.790139 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-service-ca\") pod \"console-f9d7485db-lqxss\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.790350 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.790470 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/308225d5-c374-4bb6-a967-020bf6e7173f-config\") pod \"authentication-operator-69f744f599-vzs54\" (UID: \"308225d5-c374-4bb6-a967-020bf6e7173f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vzs54" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.790988 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0499c819-4b67-4882-9354-f7b9d6d2adc7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-r49hv\" (UID: \"0499c819-4b67-4882-9354-f7b9d6d2adc7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r49hv" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.791310 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.791328 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/308225d5-c374-4bb6-a967-020bf6e7173f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vzs54\" (UID: \"308225d5-c374-4bb6-a967-020bf6e7173f\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-vzs54" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.791714 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76-default-certificate\") pod \"router-default-5444994796-zvvrp\" (UID: \"ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76\") " pod="openshift-ingress/router-default-5444994796-zvvrp" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.792136 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.792495 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e697897f-0594-48da-967d-e429421b8fec-images\") pod \"machine-api-operator-5694c8668f-qzkgg\" (UID: \"e697897f-0594-48da-967d-e429421b8fec\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qzkgg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.792531 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-oauth-serving-cert\") pod \"console-f9d7485db-lqxss\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.792713 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.792873 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-audit-policies\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.793067 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-etcd-serving-ca\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.793121 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ace060b-1d1d-4704-9655-78ab3599db2b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7fqgj\" (UID: \"1ace060b-1d1d-4704-9655-78ab3599db2b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7fqgj" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.793487 4636 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.793592 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/308225d5-c374-4bb6-a967-020bf6e7173f-serving-cert\") pod \"authentication-operator-69f744f599-vzs54\" (UID: \"308225d5-c374-4bb6-a967-020bf6e7173f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vzs54" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.793975 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/13703c39-6eda-487f-9d53-509c6042d515-machine-approver-tls\") pod \"machine-approver-56656f9798-qfbfg\" (UID: \"13703c39-6eda-487f-9d53-509c6042d515\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qfbfg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.794094 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55eb43d6-a42c-4b21-a8e9-82d2c75ee839-config\") pod \"console-operator-58897d9998-fzp9w\" (UID: \"55eb43d6-a42c-4b21-a8e9-82d2c75ee839\") " pod="openshift-console-operator/console-operator-58897d9998-fzp9w" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.794406 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e697897f-0594-48da-967d-e429421b8fec-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qzkgg\" (UID: \"e697897f-0594-48da-967d-e429421b8fec\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qzkgg" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.794443 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c83442ff-933c-4f99-aae8-522e4dc94199-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-877d7\" (UID: \"c83442ff-933c-4f99-aae8-522e4dc94199\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-877d7" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.795482 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.796593 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5fee0926-3042-4015-ad02-90f4306431ae-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pb9vh\" (UID: \"5fee0926-3042-4015-ad02-90f4306431ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pb9vh" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.796685 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-node-pullsecrets\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.796739 4636 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-audit-dir\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.797091 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.798003 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-audit-policies\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.798439 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.799507 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76-service-ca-bundle\") pod \"router-default-5444994796-zvvrp\" (UID: \"ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76\") " pod="openshift-ingress/router-default-5444994796-zvvrp" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.799914 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cf0aef3-c9be-4539-91bb-0a26d7d2a82e-serving-cert\") pod \"openshift-config-operator-7777fb866f-6qz8k\" (UID: \"7cf0aef3-c9be-4539-91bb-0a26d7d2a82e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qz8k" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.800087 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.800728 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.800757 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c513f61-cee7-451f-b8a9-1dab425641a8-serving-cert\") pod \"route-controller-manager-6576b87f9c-pc4j4\" (UID: \"8c513f61-cee7-451f-b8a9-1dab425641a8\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.800820 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6977d44-d8ff-4d40-959f-024da50c53fe-console-oauth-config\") pod \"console-f9d7485db-lqxss\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.800861 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-serving-cert\") pod \"controller-manager-879f6c89f-mz2wd\" (UID: \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.802648 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-encryption-config\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.802961 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-serving-cert\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.803036 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/010ef4ac-9542-4a76-a005-385439b1045c-audit-dir\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.803090 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-audit-dir\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.803077 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-console-config\") pod \"console-f9d7485db-lqxss\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.803223 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.803476 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55eb43d6-a42c-4b21-a8e9-82d2c75ee839-trusted-ca\") pod \"console-operator-58897d9998-fzp9w\" (UID: \"55eb43d6-a42c-4b21-a8e9-82d2c75ee839\") " pod="openshift-console-operator/console-operator-58897d9998-fzp9w" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.805577 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.805615 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a0e6228-8a56-4b62-87f1-24eec9cffdd5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fcxkp\" (UID: \"3a0e6228-8a56-4b62-87f1-24eec9cffdd5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fcxkp" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.805656 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-encryption-config\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.806073 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76-stats-auth\") pod \"router-default-5444994796-zvvrp\" (UID: \"ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76\") " pod="openshift-ingress/router-default-5444994796-zvvrp" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.806167 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-serving-cert\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.806393 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0013e14d-2163-45f2-8a98-dbe6805e40d0-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rsgnn\" (UID: \"0013e14d-2163-45f2-8a98-dbe6805e40d0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rsgnn" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.808487 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.808642 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-etcd-client\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.808788 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6977d44-d8ff-4d40-959f-024da50c53fe-console-serving-cert\") pod \"console-f9d7485db-lqxss\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.808899 4636 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55eb43d6-a42c-4b21-a8e9-82d2c75ee839-serving-cert\") pod \"console-operator-58897d9998-fzp9w\" (UID: \"55eb43d6-a42c-4b21-a8e9-82d2c75ee839\") " pod="openshift-console-operator/console-operator-58897d9998-fzp9w" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.809530 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.816008 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76-metrics-certs\") pod \"router-default-5444994796-zvvrp\" (UID: \"ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76\") " pod="openshift-ingress/router-default-5444994796-zvvrp" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.822317 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.834629 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/308225d5-c374-4bb6-a967-020bf6e7173f-service-ca-bundle\") pod \"authentication-operator-69f744f599-vzs54\" (UID: \"308225d5-c374-4bb6-a967-020bf6e7173f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vzs54" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.838539 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-config\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.842791 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.882636 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.891340 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6eefb2-0ce4-4501-b443-d06c45efd41f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-fqn9d\" (UID: \"9d6eefb2-0ce4-4501-b443-d06c45efd41f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fqn9d" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.902909 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.922324 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d6eefb2-0ce4-4501-b443-d06c45efd41f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-fqn9d\" (UID: 
\"9d6eefb2-0ce4-4501-b443-d06c45efd41f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fqn9d" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.922495 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.942730 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.962397 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 03 14:03:10 crc kubenswrapper[4636]: I1003 14:03:10.982254 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.002748 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.023422 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.042909 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.047621 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/64802881-57b2-4263-b5d8-f3c4c224c692-secret-volume\") pod \"collect-profiles-29325000-fk4kf\" (UID: \"64802881-57b2-4263-b5d8-f3c4c224c692\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.055555 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8d6ca41b-f3b4-4bce-ad85-0150cfeb2362-profile-collector-cert\") pod \"catalog-operator-68c6474976-44xt7\" (UID: \"8d6ca41b-f3b4-4bce-ad85-0150cfeb2362\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-44xt7" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.063645 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.065426 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/64802881-57b2-4263-b5d8-f3c4c224c692-config-volume\") pod \"collect-profiles-29325000-fk4kf\" (UID: \"64802881-57b2-4263-b5d8-f3c4c224c692\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.083949 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.091511 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.103640 4636 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.114610 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e97026a-c5d6-4767-8ff5-54fad15f7a49-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tpww2\" (UID: \"6e97026a-c5d6-4767-8ff5-54fad15f7a49\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tpww2" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.123086 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.125803 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e97026a-c5d6-4767-8ff5-54fad15f7a49-config\") pod \"kube-apiserver-operator-766d6c64bb-tpww2\" (UID: \"6e97026a-c5d6-4767-8ff5-54fad15f7a49\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tpww2" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.143632 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.163053 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.184036 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.197132 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8d6ca41b-f3b4-4bce-ad85-0150cfeb2362-srv-cert\") pod \"catalog-operator-68c6474976-44xt7\" (UID: \"8d6ca41b-f3b4-4bce-ad85-0150cfeb2362\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-44xt7" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.202683 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.222935 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.229405 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f83d7f1b-5df1-40e0-ac89-3cfca1fd0b23-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9gjr4\" (UID: \"f83d7f1b-5df1-40e0-ac89-3cfca1fd0b23\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9gjr4" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.243266 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.244352 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f83d7f1b-5df1-40e0-ac89-3cfca1fd0b23-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9gjr4\" (UID: \"f83d7f1b-5df1-40e0-ac89-3cfca1fd0b23\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9gjr4" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.263271 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.283407 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.302519 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.323135 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.335282 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0aa8bab3-482e-41e8-800e-9962a4146194-metrics-tls\") pod \"dns-operator-744455d44c-c9dcg\" (UID: \"0aa8bab3-482e-41e8-800e-9962a4146194\") " pod="openshift-dns-operator/dns-operator-744455d44c-c9dcg" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.343403 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.408851 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.408930 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.422983 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.446539 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.462797 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.473826 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f02b12d5-48d5-48ed-81c7-db4e06189afe-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-tw6g9\" (UID: \"f02b12d5-48d5-48ed-81c7-db4e06189afe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tw6g9" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.482687 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.485799 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f02b12d5-48d5-48ed-81c7-db4e06189afe-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-tw6g9\" (UID: \"f02b12d5-48d5-48ed-81c7-db4e06189afe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tw6g9" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 
14:03:11.503574 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.523059 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.543352 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.562198 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.582658 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.603437 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.623239 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.643502 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.662940 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.680807 4636 request.go:700] Waited for 1.00438854s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.682640 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.702652 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.722762 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.742682 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.762924 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.782831 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.802810 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.822901 4636 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.842879 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.862393 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.883766 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.902562 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.923442 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.942541 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.963008 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 03 14:03:11 crc kubenswrapper[4636]: I1003 14:03:11.983515 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.008661 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.023741 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.042464 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.062936 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.090758 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.103268 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.122163 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.142502 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.162868 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.183081 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.201807 4636 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca"/"signing-key" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.222449 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.243404 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.263118 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.282928 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.303348 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.322961 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.341897 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.362883 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.383043 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.402534 4636 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.422057 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.442853 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.462826 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.482293 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.502696 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.536597 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thvtp\" (UniqueName: \"kubernetes.io/projected/9d6eefb2-0ce4-4501-b443-d06c45efd41f-kube-api-access-thvtp\") pod \"openshift-controller-manager-operator-756b6f6bc6-fqn9d\" (UID: \"9d6eefb2-0ce4-4501-b443-d06c45efd41f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fqn9d" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.557120 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtwjd\" (UniqueName: \"kubernetes.io/projected/c83442ff-933c-4f99-aae8-522e4dc94199-kube-api-access-vtwjd\") pod 
\"cluster-samples-operator-665b6dd947-877d7\" (UID: \"c83442ff-933c-4f99-aae8-522e4dc94199\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-877d7" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.573738 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fqn9d" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.576630 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b5dg\" (UniqueName: \"kubernetes.io/projected/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-kube-api-access-8b5dg\") pod \"controller-manager-879f6c89f-mz2wd\" (UID: \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.597652 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfwwv\" (UniqueName: \"kubernetes.io/projected/ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76-kube-api-access-bfwwv\") pod \"router-default-5444994796-zvvrp\" (UID: \"ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76\") " pod="openshift-ingress/router-default-5444994796-zvvrp" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.617143 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-477ck\" (UniqueName: \"kubernetes.io/projected/3a0e6228-8a56-4b62-87f1-24eec9cffdd5-kube-api-access-477ck\") pod \"multus-admission-controller-857f4d67dd-fcxkp\" (UID: \"3a0e6228-8a56-4b62-87f1-24eec9cffdd5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fcxkp" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.638527 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f83d7f1b-5df1-40e0-ac89-3cfca1fd0b23-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9gjr4\" (UID: \"f83d7f1b-5df1-40e0-ac89-3cfca1fd0b23\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9gjr4" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.658196 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e97026a-c5d6-4767-8ff5-54fad15f7a49-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-tpww2\" (UID: \"6e97026a-c5d6-4767-8ff5-54fad15f7a49\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tpww2" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.675371 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tpww2" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.682311 4636 request.go:700] Waited for 1.896973259s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/console/token Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.691475 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9gjr4" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.702045 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwmjm\" (UniqueName: \"kubernetes.io/projected/f6977d44-d8ff-4d40-959f-024da50c53fe-kube-api-access-mwmjm\") pod \"console-f9d7485db-lqxss\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.708152 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftpjn\" (UniqueName: \"kubernetes.io/projected/0777cf5e-3bec-495f-8e8c-5d25b7a7b46b-kube-api-access-ftpjn\") pod \"apiserver-7bbb656c7d-zcndt\" (UID: \"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.719866 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-877d7" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.724575 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5btf\" (UniqueName: \"kubernetes.io/projected/f02b12d5-48d5-48ed-81c7-db4e06189afe-kube-api-access-z5btf\") pod \"kube-storage-version-migrator-operator-b67b599dd-tw6g9\" (UID: \"f02b12d5-48d5-48ed-81c7-db4e06189afe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tw6g9" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.740696 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4dcf\" (UniqueName: \"kubernetes.io/projected/5fee0926-3042-4015-ad02-90f4306431ae-kube-api-access-k4dcf\") pod \"machine-config-operator-74547568cd-pb9vh\" (UID: \"5fee0926-3042-4015-ad02-90f4306431ae\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pb9vh" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.753504 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.762767 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ace060b-1d1d-4704-9655-78ab3599db2b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7fqgj\" (UID: \"1ace060b-1d1d-4704-9655-78ab3599db2b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7fqgj" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.779129 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8cbc4a55-551d-4314-bcb0-751f82313dc0-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-j8z62\" (UID: \"8cbc4a55-551d-4314-bcb0-751f82313dc0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-j8z62" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.808688 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2kjc\" (UniqueName: \"kubernetes.io/projected/010ef4ac-9542-4a76-a005-385439b1045c-kube-api-access-z2kjc\") pod \"oauth-openshift-558db77b4-srw4g\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.814201 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fqn9d"] Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.850216 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x5qd\" (UniqueName: \"kubernetes.io/projected/64802881-57b2-4263-b5d8-f3c4c224c692-kube-api-access-6x5qd\") pod \"collect-profiles-29325000-fk4kf\" (UID: \"64802881-57b2-4263-b5d8-f3c4c224c692\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.850362 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-fcxkp" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.854061 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7fqgj" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.858509 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d2h8\" (UniqueName: \"kubernetes.io/projected/13703c39-6eda-487f-9d53-509c6042d515-kube-api-access-9d2h8\") pod \"machine-approver-56656f9798-qfbfg\" (UID: \"13703c39-6eda-487f-9d53-509c6042d515\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qfbfg" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.864551 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.872581 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gwrd\" (UniqueName: \"kubernetes.io/projected/8d6ca41b-f3b4-4bce-ad85-0150cfeb2362-kube-api-access-6gwrd\") pod \"catalog-operator-68c6474976-44xt7\" (UID: \"8d6ca41b-f3b4-4bce-ad85-0150cfeb2362\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-44xt7" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.897212 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-zvvrp" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.900586 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccdcn\" (UniqueName: \"kubernetes.io/projected/7cf0aef3-c9be-4539-91bb-0a26d7d2a82e-kube-api-access-ccdcn\") pod \"openshift-config-operator-7777fb866f-6qz8k\" (UID: \"7cf0aef3-c9be-4539-91bb-0a26d7d2a82e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qz8k" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.900642 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbbb5\" (UniqueName: \"kubernetes.io/projected/0aa8bab3-482e-41e8-800e-9962a4146194-kube-api-access-fbbb5\") pod \"dns-operator-744455d44c-c9dcg\" (UID: \"0aa8bab3-482e-41e8-800e-9962a4146194\") " pod="openshift-dns-operator/dns-operator-744455d44c-c9dcg" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.936246 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bjvx\" (UniqueName: \"kubernetes.io/projected/8cbc4a55-551d-4314-bcb0-751f82313dc0-kube-api-access-2bjvx\") pod \"cluster-image-registry-operator-dc59b4c8b-j8z62\" (UID: \"8cbc4a55-551d-4314-bcb0-751f82313dc0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-j8z62" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.936847 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tpww2"] Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.939406 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psnzs\" (UniqueName: \"kubernetes.io/projected/0499c819-4b67-4882-9354-f7b9d6d2adc7-kube-api-access-psnzs\") pod \"control-plane-machine-set-operator-78cbb6b69f-r49hv\" (UID: \"0499c819-4b67-4882-9354-f7b9d6d2adc7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r49hv" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.952883 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pb9vh" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.959077 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87t6p\" (UniqueName: \"kubernetes.io/projected/40e1f28c-6d64-4fa1-b554-507ff389f115-kube-api-access-87t6p\") pod \"downloads-7954f5f757-5q7j6\" (UID: \"40e1f28c-6d64-4fa1-b554-507ff389f115\") " pod="openshift-console/downloads-7954f5f757-5q7j6" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.961974 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.968481 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" Oct 03 14:03:12 crc kubenswrapper[4636]: W1003 14:03:12.969138 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddcdaf13_a4b8_43c6_9e69_2fd8d8594f76.slice/crio-dcd9c5787efcae285db792d467b08ff66d100780ebc29ab9123a66677461b103 WatchSource:0}: Error finding container dcd9c5787efcae285db792d467b08ff66d100780ebc29ab9123a66677461b103: Status 404 returned error can't find the container with id dcd9c5787efcae285db792d467b08ff66d100780ebc29ab9123a66677461b103 Oct 03 14:03:12 crc kubenswrapper[4636]: W1003 14:03:12.970856 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e97026a_c5d6_4767_8ff5_54fad15f7a49.slice/crio-ecd147587d0f39f4b675fa879346e8758764a2bb34cdfdc59e15cde79bd8fb0c WatchSource:0}: Error finding container ecd147587d0f39f4b675fa879346e8758764a2bb34cdfdc59e15cde79bd8fb0c: Status 404 returned error can't find the container with id ecd147587d0f39f4b675fa879346e8758764a2bb34cdfdc59e15cde79bd8fb0c Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.979720 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq9l9\" (UniqueName: \"kubernetes.io/projected/55eb43d6-a42c-4b21-a8e9-82d2c75ee839-kube-api-access-dq9l9\") pod \"console-operator-58897d9998-fzp9w\" (UID: \"55eb43d6-a42c-4b21-a8e9-82d2c75ee839\") " pod="openshift-console-operator/console-operator-58897d9998-fzp9w" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.979731 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-44xt7" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.995305 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c9dcg" Oct 03 14:03:12 crc kubenswrapper[4636]: I1003 14:03:12.996283 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-5q7j6" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.004374 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfzzt\" (UniqueName: \"kubernetes.io/projected/8c513f61-cee7-451f-b8a9-1dab425641a8-kube-api-access-mfzzt\") pod \"route-controller-manager-6576b87f9c-pc4j4\" (UID: \"8c513f61-cee7-451f-b8a9-1dab425641a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.006363 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tw6g9" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.011213 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-877d7"] Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.029434 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb5t7\" (UniqueName: \"kubernetes.io/projected/0013e14d-2163-45f2-8a98-dbe6805e40d0-kube-api-access-pb5t7\") pod \"openshift-apiserver-operator-796bbdcf4f-rsgnn\" (UID: \"0013e14d-2163-45f2-8a98-dbe6805e40d0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rsgnn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.038339 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmp2b\" (UniqueName: \"kubernetes.io/projected/e697897f-0594-48da-967d-e429421b8fec-kube-api-access-gmp2b\") pod \"machine-api-operator-5694c8668f-qzkgg\" (UID: \"e697897f-0594-48da-967d-e429421b8fec\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qzkgg" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.072854 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mnzr\" (UniqueName: \"kubernetes.io/projected/308225d5-c374-4bb6-a967-020bf6e7173f-kube-api-access-6mnzr\") pod \"authentication-operator-69f744f599-vzs54\" (UID: \"308225d5-c374-4bb6-a967-020bf6e7173f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vzs54" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.077417 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qfbfg" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.078832 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-lqxss"] Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.090302 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.093032 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pj24\" (UniqueName: \"kubernetes.io/projected/62f6cc0c-eb9d-44ef-8ce7-93a6148c3264-kube-api-access-8pj24\") pod \"apiserver-76f77b778f-2p8qq\" (UID: \"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264\") " pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.102485 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-vzs54" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.113607 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qz8k" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.124062 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rsgnn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.133389 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-registry-tls\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.133435 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-registry-certificates\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.133458 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-bound-sa-token\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.133475 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh6fd\" (UniqueName: \"kubernetes.io/projected/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-kube-api-access-vh6fd\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.133497 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-trusted-ca\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.133527 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-installation-pull-secrets\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.133551 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-ca-trust-extracted\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.133604 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: E1003 14:03:13.133875 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:13.633864809 +0000 UTC m=+143.492591056 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.134127 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-fzp9w" Oct 03 14:03:13 crc kubenswrapper[4636]: W1003 14:03:13.136524 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6977d44_d8ff_4d40_959f_024da50c53fe.slice/crio-b2b9378e0b9e22502db48d84622387f275e08bfe44fb26b32c2339a2196d4115 WatchSource:0}: Error finding container b2b9378e0b9e22502db48d84622387f275e08bfe44fb26b32c2339a2196d4115: Status 404 returned error can't find the container with id b2b9378e0b9e22502db48d84622387f275e08bfe44fb26b32c2339a2196d4115 Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.141367 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r49hv" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.149880 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7fqgj"] Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.178945 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9gjr4"] Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.204813 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qzkgg" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.224351 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.232499 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-j8z62" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.236302 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:13 crc kubenswrapper[4636]: E1003 14:03:13.236590 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:13.736563407 +0000 UTC m=+143.595289654 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.236667 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ab96097e-4edb-4abf-a7b4-c05a20f86659-srv-cert\") pod \"olm-operator-6b444d44fb-lvb2q\" (UID: \"ab96097e-4edb-4abf-a7b4-c05a20f86659\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvb2q" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.236734 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ebf1ddd-591b-408d-914e-eb316cabd08c-serving-cert\") pod \"service-ca-operator-777779d784-ds4fg\" (UID: \"5ebf1ddd-591b-408d-914e-eb316cabd08c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ds4fg" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.236753 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2154c02b-316a-4776-944f-734586e04489-config-volume\") pod \"dns-default-bgrwl\" (UID: \"2154c02b-316a-4776-944f-734586e04489\") " pod="openshift-dns/dns-default-bgrwl" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.236778 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e8c328d2-8ab9-446f-9cb0-ebc873785d90-certs\") pod \"machine-config-server-92b88\" (UID: \"e8c328d2-8ab9-446f-9cb0-ebc873785d90\") " pod="openshift-machine-config-operator/machine-config-server-92b88" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.236824 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-installation-pull-secrets\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.236893 4636 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-ca-trust-extracted\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.236910 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzjns\" (UniqueName: \"kubernetes.io/projected/29b5d949-e3d0-4f7b-9f11-5a37beb5ead2-kube-api-access-vzjns\") pod \"migrator-59844c95c7-htgvn\" (UID: \"29b5d949-e3d0-4f7b-9f11-5a37beb5ead2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-htgvn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.236939 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0f4e9ccb-4580-4ea7-a7bd-181bab6530c0-signing-cabundle\") pod \"service-ca-9c57cc56f-6bp9m\" (UID: \"0f4e9ccb-4580-4ea7-a7bd-181bab6530c0\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bp9m" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.237474 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ccb55971-c2d7-440a-bb7a-dcc1d9c0b562-webhook-cert\") pod \"packageserver-d55dfcdfc-r94vn\" (UID: \"ccb55971-c2d7-440a-bb7a-dcc1d9c0b562\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.237543 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ff63da45-76c4-4b40-a65b-177b4bfa9feb-proxy-tls\") pod \"machine-config-controller-84d6567774-hndq5\" (UID: \"ff63da45-76c4-4b40-a65b-177b4bfa9feb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hndq5" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.237587 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mz2wd"] Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.239185 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bc7cfea-82a6-48a1-995a-b473d672a62d-trusted-ca\") pod \"ingress-operator-5b745b69d9-jmjtn\" (UID: \"8bc7cfea-82a6-48a1-995a-b473d672a62d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jmjtn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.239240 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq8dk\" (UniqueName: \"kubernetes.io/projected/5ebf1ddd-591b-408d-914e-eb316cabd08c-kube-api-access-hq8dk\") pod \"service-ca-operator-777779d784-ds4fg\" (UID: \"5ebf1ddd-591b-408d-914e-eb316cabd08c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ds4fg" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.239261 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rjp5j\" (UID: \"bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.240158 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1c9fc77b-189a-4fbc-9449-491f7a1700b9-etcd-client\") pod \"etcd-operator-b45778765-gnzm2\" (UID: \"1c9fc77b-189a-4fbc-9449-491f7a1700b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.240202 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2c7t\" (UniqueName: \"kubernetes.io/projected/2154c02b-316a-4776-944f-734586e04489-kube-api-access-w2c7t\") pod \"dns-default-bgrwl\" (UID: \"2154c02b-316a-4776-944f-734586e04489\") " pod="openshift-dns/dns-default-bgrwl" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.240249 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gt6v\" (UniqueName: \"kubernetes.io/projected/1c9fc77b-189a-4fbc-9449-491f7a1700b9-kube-api-access-2gt6v\") pod \"etcd-operator-b45778765-gnzm2\" (UID: \"1c9fc77b-189a-4fbc-9449-491f7a1700b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.240930 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dwjx\" (UniqueName: \"kubernetes.io/projected/8bc7cfea-82a6-48a1-995a-b473d672a62d-kube-api-access-6dwjx\") pod \"ingress-operator-5b745b69d9-jmjtn\" (UID: \"8bc7cfea-82a6-48a1-995a-b473d672a62d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jmjtn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.241709 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ccb55971-c2d7-440a-bb7a-dcc1d9c0b562-tmpfs\") pod \"packageserver-d55dfcdfc-r94vn\" (UID: \"ccb55971-c2d7-440a-bb7a-dcc1d9c0b562\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.241928 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-ca-trust-extracted\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.242083 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.242220 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/72d5c706-f441-4f26-99b0-c8979fb0c3f3-registration-dir\") pod \"csi-hostpathplugin-ljmrj\" (UID: \"72d5c706-f441-4f26-99b0-c8979fb0c3f3\") " pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.242247 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/72d5c706-f441-4f26-99b0-c8979fb0c3f3-plugins-dir\") pod \"csi-hostpathplugin-ljmrj\" (UID: \"72d5c706-f441-4f26-99b0-c8979fb0c3f3\") " pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.242651 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/72d5c706-f441-4f26-99b0-c8979fb0c3f3-mountpoint-dir\") pod \"csi-hostpathplugin-ljmrj\" (UID: \"72d5c706-f441-4f26-99b0-c8979fb0c3f3\") " pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.242676 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2154c02b-316a-4776-944f-734586e04489-metrics-tls\") pod \"dns-default-bgrwl\" (UID: \"2154c02b-316a-4776-944f-734586e04489\") " pod="openshift-dns/dns-default-bgrwl" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.242699 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tj9j\" (UniqueName: \"kubernetes.io/projected/45b4ca9d-0850-4212-a6d4-9ff374d3e982-kube-api-access-4tj9j\") pod \"ingress-canary-54vkr\" (UID: \"45b4ca9d-0850-4212-a6d4-9ff374d3e982\") " pod="openshift-ingress-canary/ingress-canary-54vkr" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.242787 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ccb55971-c2d7-440a-bb7a-dcc1d9c0b562-apiservice-cert\") pod \"packageserver-d55dfcdfc-r94vn\" (UID: \"ccb55971-c2d7-440a-bb7a-dcc1d9c0b562\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.242901 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/72d5c706-f441-4f26-99b0-c8979fb0c3f3-csi-data-dir\") pod \"csi-hostpathplugin-ljmrj\" (UID: \"72d5c706-f441-4f26-99b0-c8979fb0c3f3\") " pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.242921 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c9fc77b-189a-4fbc-9449-491f7a1700b9-config\") pod \"etcd-operator-b45778765-gnzm2\" (UID: \"1c9fc77b-189a-4fbc-9449-491f7a1700b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.242946 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" 
(UniqueName: \"kubernetes.io/secret/e8c328d2-8ab9-446f-9cb0-ebc873785d90-node-bootstrap-token\") pod \"machine-config-server-92b88\" (UID: \"e8c328d2-8ab9-446f-9cb0-ebc873785d90\") " pod="openshift-machine-config-operator/machine-config-server-92b88" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.243199 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.243226 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rksq8\" (UniqueName: \"kubernetes.io/projected/005bbea4-d7a5-413a-9c19-d9503a370566-kube-api-access-rksq8\") pod \"package-server-manager-789f6589d5-gjrbj\" (UID: \"005bbea4-d7a5-413a-9c19-d9503a370566\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gjrbj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.243251 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/72d5c706-f441-4f26-99b0-c8979fb0c3f3-socket-dir\") pod \"csi-hostpathplugin-ljmrj\" (UID: \"72d5c706-f441-4f26-99b0-c8979fb0c3f3\") " pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.243265 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1c9fc77b-189a-4fbc-9449-491f7a1700b9-etcd-service-ca\") pod \"etcd-operator-b45778765-gnzm2\" (UID: \"1c9fc77b-189a-4fbc-9449-491f7a1700b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.243280 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ebf1ddd-591b-408d-914e-eb316cabd08c-config\") pod \"service-ca-operator-777779d784-ds4fg\" (UID: \"5ebf1ddd-591b-408d-914e-eb316cabd08c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ds4fg" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.243401 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xh4x\" (UniqueName: \"kubernetes.io/projected/72d5c706-f441-4f26-99b0-c8979fb0c3f3-kube-api-access-5xh4x\") pod \"csi-hostpathplugin-ljmrj\" (UID: \"72d5c706-f441-4f26-99b0-c8979fb0c3f3\") " pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.243418 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ab96097e-4edb-4abf-a7b4-c05a20f86659-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lvb2q\" (UID: \"ab96097e-4edb-4abf-a7b4-c05a20f86659\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvb2q" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.243434 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/ff63da45-76c4-4b40-a65b-177b4bfa9feb-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hndq5\" (UID: \"ff63da45-76c4-4b40-a65b-177b4bfa9feb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hndq5" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.243450 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-registry-tls\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.243475 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvjh2\" (UniqueName: \"kubernetes.io/projected/ff63da45-76c4-4b40-a65b-177b4bfa9feb-kube-api-access-pvjh2\") pod \"machine-config-controller-84d6567774-hndq5\" (UID: \"ff63da45-76c4-4b40-a65b-177b4bfa9feb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hndq5" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.243508 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1c9fc77b-189a-4fbc-9449-491f7a1700b9-etcd-ca\") pod \"etcd-operator-b45778765-gnzm2\" (UID: \"1c9fc77b-189a-4fbc-9449-491f7a1700b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" Oct 03 14:03:13 crc kubenswrapper[4636]: E1003 14:03:13.244470 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:13.744457158 +0000 UTC m=+143.603183405 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.244815 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4xqm\" (UniqueName: \"kubernetes.io/projected/ccb55971-c2d7-440a-bb7a-dcc1d9c0b562-kube-api-access-h4xqm\") pod \"packageserver-d55dfcdfc-r94vn\" (UID: \"ccb55971-c2d7-440a-bb7a-dcc1d9c0b562\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.244886 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-registry-certificates\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.244935 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lt6w\" (UniqueName: \"kubernetes.io/projected/0f4e9ccb-4580-4ea7-a7bd-181bab6530c0-kube-api-access-9lt6w\") pod \"service-ca-9c57cc56f-6bp9m\" (UID: \"0f4e9ccb-4580-4ea7-a7bd-181bab6530c0\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bp9m" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.244954 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c9fc77b-189a-4fbc-9449-491f7a1700b9-serving-cert\") pod \"etcd-operator-b45778765-gnzm2\" (UID: \"1c9fc77b-189a-4fbc-9449-491f7a1700b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.245587 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8bc7cfea-82a6-48a1-995a-b473d672a62d-metrics-tls\") pod \"ingress-operator-5b745b69d9-jmjtn\" (UID: \"8bc7cfea-82a6-48a1-995a-b473d672a62d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jmjtn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.246515 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rjp5j\" (UID: \"bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed\") " pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.246567 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shpqx\" (UniqueName: \"kubernetes.io/projected/bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed-kube-api-access-shpqx\") pod \"marketplace-operator-79b997595-rjp5j\" (UID: \"bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed\") " pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 
14:03:13.246607 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-bound-sa-token\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.246655 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/005bbea4-d7a5-413a-9c19-d9503a370566-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-gjrbj\" (UID: \"005bbea4-d7a5-413a-9c19-d9503a370566\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gjrbj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.246709 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nng2k\" (UniqueName: \"kubernetes.io/projected/ab96097e-4edb-4abf-a7b4-c05a20f86659-kube-api-access-nng2k\") pod \"olm-operator-6b444d44fb-lvb2q\" (UID: \"ab96097e-4edb-4abf-a7b4-c05a20f86659\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvb2q" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.247855 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh6fd\" (UniqueName: \"kubernetes.io/projected/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-kube-api-access-vh6fd\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.247909 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8bc7cfea-82a6-48a1-995a-b473d672a62d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jmjtn\" (UID: \"8bc7cfea-82a6-48a1-995a-b473d672a62d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jmjtn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.249934 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0f4e9ccb-4580-4ea7-a7bd-181bab6530c0-signing-key\") pod \"service-ca-9c57cc56f-6bp9m\" (UID: \"0f4e9ccb-4580-4ea7-a7bd-181bab6530c0\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bp9m" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.250213 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btz2b\" (UniqueName: \"kubernetes.io/projected/e8c328d2-8ab9-446f-9cb0-ebc873785d90-kube-api-access-btz2b\") pod \"machine-config-server-92b88\" (UID: \"e8c328d2-8ab9-446f-9cb0-ebc873785d90\") " pod="openshift-machine-config-operator/machine-config-server-92b88" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.250248 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45b4ca9d-0850-4212-a6d4-9ff374d3e982-cert\") pod \"ingress-canary-54vkr\" (UID: \"45b4ca9d-0850-4212-a6d4-9ff374d3e982\") " pod="openshift-ingress-canary/ingress-canary-54vkr" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.250282 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-trusted-ca\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.251041 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-registry-tls\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.251568 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-trusted-ca\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.255166 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-installation-pull-secrets\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.263307 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-registry-certificates\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.291390 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-fcxkp"] Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.309477 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-bound-sa-token\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.320935 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh6fd\" (UniqueName: \"kubernetes.io/projected/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-kube-api-access-vh6fd\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.325757 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf"] Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.336340 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-5q7j6"] Oct 03 14:03:13 crc kubenswrapper[4636]: W1003 14:03:13.351172 4636 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf83d7f1b_5df1_40e0_ac89_3cfca1fd0b23.slice/crio-9190e3f0a541e06314c80747ac9a60f75b5fcd6ac59da8df18a36fb1cfbc6924 WatchSource:0}: Error finding container 9190e3f0a541e06314c80747ac9a60f75b5fcd6ac59da8df18a36fb1cfbc6924: Status 404 returned error can't find the container with id 9190e3f0a541e06314c80747ac9a60f75b5fcd6ac59da8df18a36fb1cfbc6924 Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352229 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352468 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzjns\" (UniqueName: \"kubernetes.io/projected/29b5d949-e3d0-4f7b-9f11-5a37beb5ead2-kube-api-access-vzjns\") pod \"migrator-59844c95c7-htgvn\" (UID: \"29b5d949-e3d0-4f7b-9f11-5a37beb5ead2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-htgvn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352503 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0f4e9ccb-4580-4ea7-a7bd-181bab6530c0-signing-cabundle\") pod \"service-ca-9c57cc56f-6bp9m\" (UID: \"0f4e9ccb-4580-4ea7-a7bd-181bab6530c0\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bp9m" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352524 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ccb55971-c2d7-440a-bb7a-dcc1d9c0b562-webhook-cert\") pod \"packageserver-d55dfcdfc-r94vn\" (UID: \"ccb55971-c2d7-440a-bb7a-dcc1d9c0b562\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352550 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ff63da45-76c4-4b40-a65b-177b4bfa9feb-proxy-tls\") pod \"machine-config-controller-84d6567774-hndq5\" (UID: \"ff63da45-76c4-4b40-a65b-177b4bfa9feb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hndq5" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352569 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bc7cfea-82a6-48a1-995a-b473d672a62d-trusted-ca\") pod \"ingress-operator-5b745b69d9-jmjtn\" (UID: \"8bc7cfea-82a6-48a1-995a-b473d672a62d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jmjtn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352593 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq8dk\" (UniqueName: \"kubernetes.io/projected/5ebf1ddd-591b-408d-914e-eb316cabd08c-kube-api-access-hq8dk\") pod \"service-ca-operator-777779d784-ds4fg\" (UID: \"5ebf1ddd-591b-408d-914e-eb316cabd08c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ds4fg" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352615 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rjp5j\" (UID: \"bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed\") " pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352642 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1c9fc77b-189a-4fbc-9449-491f7a1700b9-etcd-client\") pod \"etcd-operator-b45778765-gnzm2\" (UID: \"1c9fc77b-189a-4fbc-9449-491f7a1700b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352664 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2c7t\" (UniqueName: \"kubernetes.io/projected/2154c02b-316a-4776-944f-734586e04489-kube-api-access-w2c7t\") pod \"dns-default-bgrwl\" (UID: \"2154c02b-316a-4776-944f-734586e04489\") " pod="openshift-dns/dns-default-bgrwl" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352687 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gt6v\" (UniqueName: \"kubernetes.io/projected/1c9fc77b-189a-4fbc-9449-491f7a1700b9-kube-api-access-2gt6v\") pod \"etcd-operator-b45778765-gnzm2\" (UID: \"1c9fc77b-189a-4fbc-9449-491f7a1700b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352713 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dwjx\" (UniqueName: \"kubernetes.io/projected/8bc7cfea-82a6-48a1-995a-b473d672a62d-kube-api-access-6dwjx\") pod \"ingress-operator-5b745b69d9-jmjtn\" (UID: \"8bc7cfea-82a6-48a1-995a-b473d672a62d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jmjtn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352736 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ccb55971-c2d7-440a-bb7a-dcc1d9c0b562-tmpfs\") pod \"packageserver-d55dfcdfc-r94vn\" (UID: \"ccb55971-c2d7-440a-bb7a-dcc1d9c0b562\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352757 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/72d5c706-f441-4f26-99b0-c8979fb0c3f3-plugins-dir\") pod \"csi-hostpathplugin-ljmrj\" (UID: \"72d5c706-f441-4f26-99b0-c8979fb0c3f3\") " pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352777 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/72d5c706-f441-4f26-99b0-c8979fb0c3f3-registration-dir\") pod \"csi-hostpathplugin-ljmrj\" (UID: \"72d5c706-f441-4f26-99b0-c8979fb0c3f3\") " pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352808 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/72d5c706-f441-4f26-99b0-c8979fb0c3f3-mountpoint-dir\") pod \"csi-hostpathplugin-ljmrj\" (UID: \"72d5c706-f441-4f26-99b0-c8979fb0c3f3\") " pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352827 4636 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2154c02b-316a-4776-944f-734586e04489-metrics-tls\") pod \"dns-default-bgrwl\" (UID: \"2154c02b-316a-4776-944f-734586e04489\") " pod="openshift-dns/dns-default-bgrwl" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352849 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tj9j\" (UniqueName: \"kubernetes.io/projected/45b4ca9d-0850-4212-a6d4-9ff374d3e982-kube-api-access-4tj9j\") pod \"ingress-canary-54vkr\" (UID: \"45b4ca9d-0850-4212-a6d4-9ff374d3e982\") " pod="openshift-ingress-canary/ingress-canary-54vkr" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352872 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ccb55971-c2d7-440a-bb7a-dcc1d9c0b562-apiservice-cert\") pod \"packageserver-d55dfcdfc-r94vn\" (UID: \"ccb55971-c2d7-440a-bb7a-dcc1d9c0b562\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352894 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/72d5c706-f441-4f26-99b0-c8979fb0c3f3-csi-data-dir\") pod \"csi-hostpathplugin-ljmrj\" (UID: \"72d5c706-f441-4f26-99b0-c8979fb0c3f3\") " pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352916 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c9fc77b-189a-4fbc-9449-491f7a1700b9-config\") pod \"etcd-operator-b45778765-gnzm2\" (UID: \"1c9fc77b-189a-4fbc-9449-491f7a1700b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352948 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e8c328d2-8ab9-446f-9cb0-ebc873785d90-node-bootstrap-token\") pod \"machine-config-server-92b88\" (UID: \"e8c328d2-8ab9-446f-9cb0-ebc873785d90\") " pod="openshift-machine-config-operator/machine-config-server-92b88" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.352980 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rksq8\" (UniqueName: \"kubernetes.io/projected/005bbea4-d7a5-413a-9c19-d9503a370566-kube-api-access-rksq8\") pod \"package-server-manager-789f6589d5-gjrbj\" (UID: \"005bbea4-d7a5-413a-9c19-d9503a370566\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gjrbj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.353000 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ebf1ddd-591b-408d-914e-eb316cabd08c-config\") pod \"service-ca-operator-777779d784-ds4fg\" (UID: \"5ebf1ddd-591b-408d-914e-eb316cabd08c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ds4fg" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.353028 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/72d5c706-f441-4f26-99b0-c8979fb0c3f3-socket-dir\") pod \"csi-hostpathplugin-ljmrj\" (UID: \"72d5c706-f441-4f26-99b0-c8979fb0c3f3\") " pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" Oct 03 14:03:13 crc 
kubenswrapper[4636]: I1003 14:03:13.353047 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1c9fc77b-189a-4fbc-9449-491f7a1700b9-etcd-service-ca\") pod \"etcd-operator-b45778765-gnzm2\" (UID: \"1c9fc77b-189a-4fbc-9449-491f7a1700b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.353066 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ff63da45-76c4-4b40-a65b-177b4bfa9feb-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hndq5\" (UID: \"ff63da45-76c4-4b40-a65b-177b4bfa9feb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hndq5" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.353089 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xh4x\" (UniqueName: \"kubernetes.io/projected/72d5c706-f441-4f26-99b0-c8979fb0c3f3-kube-api-access-5xh4x\") pod \"csi-hostpathplugin-ljmrj\" (UID: \"72d5c706-f441-4f26-99b0-c8979fb0c3f3\") " pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.353820 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ab96097e-4edb-4abf-a7b4-c05a20f86659-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lvb2q\" (UID: \"ab96097e-4edb-4abf-a7b4-c05a20f86659\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvb2q" Oct 03 14:03:13 crc kubenswrapper[4636]: E1003 14:03:13.354466 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:13.854441321 +0000 UTC m=+143.713167648 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.354917 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ccb55971-c2d7-440a-bb7a-dcc1d9c0b562-tmpfs\") pod \"packageserver-d55dfcdfc-r94vn\" (UID: \"ccb55971-c2d7-440a-bb7a-dcc1d9c0b562\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.355044 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/72d5c706-f441-4f26-99b0-c8979fb0c3f3-registration-dir\") pod \"csi-hostpathplugin-ljmrj\" (UID: \"72d5c706-f441-4f26-99b0-c8979fb0c3f3\") " pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.355456 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/72d5c706-f441-4f26-99b0-c8979fb0c3f3-mountpoint-dir\") pod \"csi-hostpathplugin-ljmrj\" (UID: \"72d5c706-f441-4f26-99b0-c8979fb0c3f3\") " pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.355822 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/72d5c706-f441-4f26-99b0-c8979fb0c3f3-plugins-dir\") pod \"csi-hostpathplugin-ljmrj\" (UID: \"72d5c706-f441-4f26-99b0-c8979fb0c3f3\") " pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.355936 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/72d5c706-f441-4f26-99b0-c8979fb0c3f3-csi-data-dir\") pod \"csi-hostpathplugin-ljmrj\" (UID: \"72d5c706-f441-4f26-99b0-c8979fb0c3f3\") " pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.357073 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvjh2\" (UniqueName: \"kubernetes.io/projected/ff63da45-76c4-4b40-a65b-177b4bfa9feb-kube-api-access-pvjh2\") pod \"machine-config-controller-84d6567774-hndq5\" (UID: \"ff63da45-76c4-4b40-a65b-177b4bfa9feb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hndq5" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.358027 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0f4e9ccb-4580-4ea7-a7bd-181bab6530c0-signing-cabundle\") pod \"service-ca-9c57cc56f-6bp9m\" (UID: \"0f4e9ccb-4580-4ea7-a7bd-181bab6530c0\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bp9m" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.358075 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ebf1ddd-591b-408d-914e-eb316cabd08c-config\") pod \"service-ca-operator-777779d784-ds4fg\" (UID: \"5ebf1ddd-591b-408d-914e-eb316cabd08c\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-ds4fg" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.358147 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/72d5c706-f441-4f26-99b0-c8979fb0c3f3-socket-dir\") pod \"csi-hostpathplugin-ljmrj\" (UID: \"72d5c706-f441-4f26-99b0-c8979fb0c3f3\") " pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.358544 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1c9fc77b-189a-4fbc-9449-491f7a1700b9-etcd-service-ca\") pod \"etcd-operator-b45778765-gnzm2\" (UID: \"1c9fc77b-189a-4fbc-9449-491f7a1700b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.360193 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ff63da45-76c4-4b40-a65b-177b4bfa9feb-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hndq5\" (UID: \"ff63da45-76c4-4b40-a65b-177b4bfa9feb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hndq5" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.361627 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e8c328d2-8ab9-446f-9cb0-ebc873785d90-node-bootstrap-token\") pod \"machine-config-server-92b88\" (UID: \"e8c328d2-8ab9-446f-9cb0-ebc873785d90\") " pod="openshift-machine-config-operator/machine-config-server-92b88" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.363045 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bc7cfea-82a6-48a1-995a-b473d672a62d-trusted-ca\") pod \"ingress-operator-5b745b69d9-jmjtn\" (UID: \"8bc7cfea-82a6-48a1-995a-b473d672a62d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jmjtn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.363378 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1c9fc77b-189a-4fbc-9449-491f7a1700b9-etcd-ca\") pod \"etcd-operator-b45778765-gnzm2\" (UID: \"1c9fc77b-189a-4fbc-9449-491f7a1700b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.363824 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1c9fc77b-189a-4fbc-9449-491f7a1700b9-etcd-ca\") pod \"etcd-operator-b45778765-gnzm2\" (UID: \"1c9fc77b-189a-4fbc-9449-491f7a1700b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.363896 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4xqm\" (UniqueName: \"kubernetes.io/projected/ccb55971-c2d7-440a-bb7a-dcc1d9c0b562-kube-api-access-h4xqm\") pod \"packageserver-d55dfcdfc-r94vn\" (UID: \"ccb55971-c2d7-440a-bb7a-dcc1d9c0b562\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.363930 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1c9fc77b-189a-4fbc-9449-491f7a1700b9-serving-cert\") pod \"etcd-operator-b45778765-gnzm2\" (UID: \"1c9fc77b-189a-4fbc-9449-491f7a1700b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.363952 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lt6w\" (UniqueName: \"kubernetes.io/projected/0f4e9ccb-4580-4ea7-a7bd-181bab6530c0-kube-api-access-9lt6w\") pod \"service-ca-9c57cc56f-6bp9m\" (UID: \"0f4e9ccb-4580-4ea7-a7bd-181bab6530c0\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bp9m" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.363992 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8bc7cfea-82a6-48a1-995a-b473d672a62d-metrics-tls\") pod \"ingress-operator-5b745b69d9-jmjtn\" (UID: \"8bc7cfea-82a6-48a1-995a-b473d672a62d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jmjtn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.364013 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rjp5j\" (UID: \"bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed\") " pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.364051 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shpqx\" (UniqueName: \"kubernetes.io/projected/bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed-kube-api-access-shpqx\") pod \"marketplace-operator-79b997595-rjp5j\" (UID: \"bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed\") " pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.364073 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/005bbea4-d7a5-413a-9c19-d9503a370566-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-gjrbj\" (UID: \"005bbea4-d7a5-413a-9c19-d9503a370566\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gjrbj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.366445 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rjp5j\" (UID: \"bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed\") " pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.369541 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nng2k\" (UniqueName: \"kubernetes.io/projected/ab96097e-4edb-4abf-a7b4-c05a20f86659-kube-api-access-nng2k\") pod \"olm-operator-6b444d44fb-lvb2q\" (UID: \"ab96097e-4edb-4abf-a7b4-c05a20f86659\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvb2q" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.369718 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8bc7cfea-82a6-48a1-995a-b473d672a62d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jmjtn\" (UID: 
\"8bc7cfea-82a6-48a1-995a-b473d672a62d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jmjtn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.369787 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2154c02b-316a-4776-944f-734586e04489-metrics-tls\") pod \"dns-default-bgrwl\" (UID: \"2154c02b-316a-4776-944f-734586e04489\") " pod="openshift-dns/dns-default-bgrwl" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.369836 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rjp5j\" (UID: \"bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed\") " pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.369851 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0f4e9ccb-4580-4ea7-a7bd-181bab6530c0-signing-key\") pod \"service-ca-9c57cc56f-6bp9m\" (UID: \"0f4e9ccb-4580-4ea7-a7bd-181bab6530c0\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bp9m" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.369929 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btz2b\" (UniqueName: \"kubernetes.io/projected/e8c328d2-8ab9-446f-9cb0-ebc873785d90-kube-api-access-btz2b\") pod \"machine-config-server-92b88\" (UID: \"e8c328d2-8ab9-446f-9cb0-ebc873785d90\") " pod="openshift-machine-config-operator/machine-config-server-92b88" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.369957 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45b4ca9d-0850-4212-a6d4-9ff374d3e982-cert\") pod \"ingress-canary-54vkr\" (UID: \"45b4ca9d-0850-4212-a6d4-9ff374d3e982\") " pod="openshift-ingress-canary/ingress-canary-54vkr" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.369992 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ab96097e-4edb-4abf-a7b4-c05a20f86659-srv-cert\") pod \"olm-operator-6b444d44fb-lvb2q\" (UID: \"ab96097e-4edb-4abf-a7b4-c05a20f86659\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvb2q" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.370018 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ebf1ddd-591b-408d-914e-eb316cabd08c-serving-cert\") pod \"service-ca-operator-777779d784-ds4fg\" (UID: \"5ebf1ddd-591b-408d-914e-eb316cabd08c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ds4fg" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.370046 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2154c02b-316a-4776-944f-734586e04489-config-volume\") pod \"dns-default-bgrwl\" (UID: \"2154c02b-316a-4776-944f-734586e04489\") " pod="openshift-dns/dns-default-bgrwl" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.370072 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e8c328d2-8ab9-446f-9cb0-ebc873785d90-certs\") pod \"machine-config-server-92b88\" (UID: 
\"e8c328d2-8ab9-446f-9cb0-ebc873785d90\") " pod="openshift-machine-config-operator/machine-config-server-92b88" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.371023 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1c9fc77b-189a-4fbc-9449-491f7a1700b9-etcd-client\") pod \"etcd-operator-b45778765-gnzm2\" (UID: \"1c9fc77b-189a-4fbc-9449-491f7a1700b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.372518 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/005bbea4-d7a5-413a-9c19-d9503a370566-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-gjrbj\" (UID: \"005bbea4-d7a5-413a-9c19-d9503a370566\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gjrbj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.372561 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ccb55971-c2d7-440a-bb7a-dcc1d9c0b562-apiservice-cert\") pod \"packageserver-d55dfcdfc-r94vn\" (UID: \"ccb55971-c2d7-440a-bb7a-dcc1d9c0b562\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.372966 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e8c328d2-8ab9-446f-9cb0-ebc873785d90-certs\") pod \"machine-config-server-92b88\" (UID: \"e8c328d2-8ab9-446f-9cb0-ebc873785d90\") " pod="openshift-machine-config-operator/machine-config-server-92b88" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.373037 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2154c02b-316a-4776-944f-734586e04489-config-volume\") pod \"dns-default-bgrwl\" (UID: \"2154c02b-316a-4776-944f-734586e04489\") " pod="openshift-dns/dns-default-bgrwl" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.375649 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8bc7cfea-82a6-48a1-995a-b473d672a62d-metrics-tls\") pod \"ingress-operator-5b745b69d9-jmjtn\" (UID: \"8bc7cfea-82a6-48a1-995a-b473d672a62d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jmjtn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.376357 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ebf1ddd-591b-408d-914e-eb316cabd08c-serving-cert\") pod \"service-ca-operator-777779d784-ds4fg\" (UID: \"5ebf1ddd-591b-408d-914e-eb316cabd08c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ds4fg" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.376422 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ff63da45-76c4-4b40-a65b-177b4bfa9feb-proxy-tls\") pod \"machine-config-controller-84d6567774-hndq5\" (UID: \"ff63da45-76c4-4b40-a65b-177b4bfa9feb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hndq5" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.376805 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1c9fc77b-189a-4fbc-9449-491f7a1700b9-serving-cert\") pod \"etcd-operator-b45778765-gnzm2\" (UID: \"1c9fc77b-189a-4fbc-9449-491f7a1700b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.379667 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c9fc77b-189a-4fbc-9449-491f7a1700b9-config\") pod \"etcd-operator-b45778765-gnzm2\" (UID: \"1c9fc77b-189a-4fbc-9449-491f7a1700b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.379968 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ccb55971-c2d7-440a-bb7a-dcc1d9c0b562-webhook-cert\") pod \"packageserver-d55dfcdfc-r94vn\" (UID: \"ccb55971-c2d7-440a-bb7a-dcc1d9c0b562\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.380078 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45b4ca9d-0850-4212-a6d4-9ff374d3e982-cert\") pod \"ingress-canary-54vkr\" (UID: \"45b4ca9d-0850-4212-a6d4-9ff374d3e982\") " pod="openshift-ingress-canary/ingress-canary-54vkr" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.380205 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ab96097e-4edb-4abf-a7b4-c05a20f86659-srv-cert\") pod \"olm-operator-6b444d44fb-lvb2q\" (UID: \"ab96097e-4edb-4abf-a7b4-c05a20f86659\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvb2q" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.391435 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0f4e9ccb-4580-4ea7-a7bd-181bab6530c0-signing-key\") pod \"service-ca-9c57cc56f-6bp9m\" (UID: \"0f4e9ccb-4580-4ea7-a7bd-181bab6530c0\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bp9m" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.396739 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tj9j\" (UniqueName: \"kubernetes.io/projected/45b4ca9d-0850-4212-a6d4-9ff374d3e982-kube-api-access-4tj9j\") pod \"ingress-canary-54vkr\" (UID: \"45b4ca9d-0850-4212-a6d4-9ff374d3e982\") " pod="openshift-ingress-canary/ingress-canary-54vkr" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.399624 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ab96097e-4edb-4abf-a7b4-c05a20f86659-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lvb2q\" (UID: \"ab96097e-4edb-4abf-a7b4-c05a20f86659\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvb2q" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.400241 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzjns\" (UniqueName: \"kubernetes.io/projected/29b5d949-e3d0-4f7b-9f11-5a37beb5ead2-kube-api-access-vzjns\") pod \"migrator-59844c95c7-htgvn\" (UID: \"29b5d949-e3d0-4f7b-9f11-5a37beb5ead2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-htgvn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.423426 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-54vkr" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.456679 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fqn9d" event={"ID":"9d6eefb2-0ce4-4501-b443-d06c45efd41f","Type":"ContainerStarted","Data":"fe10202ac68b0ccb556ddea7cf412d328bc8bc39e07720b4015fdec1b7cb036f"} Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.460895 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-zvvrp" event={"ID":"ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76","Type":"ContainerStarted","Data":"dcd9c5787efcae285db792d467b08ff66d100780ebc29ab9123a66677461b103"} Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.462840 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lqxss" event={"ID":"f6977d44-d8ff-4d40-959f-024da50c53fe","Type":"ContainerStarted","Data":"b2b9378e0b9e22502db48d84622387f275e08bfe44fb26b32c2339a2196d4115"} Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.467563 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf" event={"ID":"64802881-57b2-4263-b5d8-f3c4c224c692","Type":"ContainerStarted","Data":"c43e9295d59321d1c558fe99f533f70366460a3c6df8158329a787dd1837adf7"} Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.468622 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7fqgj" event={"ID":"1ace060b-1d1d-4704-9655-78ab3599db2b","Type":"ContainerStarted","Data":"5910b43ae4b5795080e79748bcfd80d35e0630c52e4d3ab1793f11a8e83c1aa5"} Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.471263 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: E1003 14:03:13.471866 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:13.971851614 +0000 UTC m=+143.830577861 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.511658 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tpww2" event={"ID":"6e97026a-c5d6-4767-8ff5-54fad15f7a49","Type":"ContainerStarted","Data":"ecd147587d0f39f4b675fa879346e8758764a2bb34cdfdc59e15cde79bd8fb0c"} Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.514931 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qfbfg" event={"ID":"13703c39-6eda-487f-9d53-509c6042d515","Type":"ContainerStarted","Data":"e9af79bde6e3beb036679d946e48eced54f9e2dbafdb3f3c48005a952f92c0fb"} Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.518377 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fcxkp" event={"ID":"3a0e6228-8a56-4b62-87f1-24eec9cffdd5","Type":"ContainerStarted","Data":"d0f2a46b1dfe2fabb66eaa550ecf85cb229986c21fbd51b5af218cd1dbfcf6cb"} Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.533763 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9gjr4" event={"ID":"f83d7f1b-5df1-40e0-ac89-3cfca1fd0b23","Type":"ContainerStarted","Data":"9190e3f0a541e06314c80747ac9a60f75b5fcd6ac59da8df18a36fb1cfbc6924"} Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.536472 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" event={"ID":"6b8981cc-75fe-4ecd-971f-f01c74e8fd74","Type":"ContainerStarted","Data":"5b131f35ee3718c726e206599a0cde558e947f1a8eeafea21f4e6bc75913ccbd"} Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.575745 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:13 crc kubenswrapper[4636]: E1003 14:03:13.576304 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:14.076283055 +0000 UTC m=+143.935009312 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.599222 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt"] Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.607388 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pb9vh"] Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.615135 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2c7t\" (UniqueName: \"kubernetes.io/projected/2154c02b-316a-4776-944f-734586e04489-kube-api-access-w2c7t\") pod \"dns-default-bgrwl\" (UID: \"2154c02b-316a-4776-944f-734586e04489\") " pod="openshift-dns/dns-default-bgrwl" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.623976 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvjh2\" (UniqueName: \"kubernetes.io/projected/ff63da45-76c4-4b40-a65b-177b4bfa9feb-kube-api-access-pvjh2\") pod \"machine-config-controller-84d6567774-hndq5\" (UID: \"ff63da45-76c4-4b40-a65b-177b4bfa9feb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hndq5" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.630523 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tw6g9"] Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.634274 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq8dk\" (UniqueName: \"kubernetes.io/projected/5ebf1ddd-591b-408d-914e-eb316cabd08c-kube-api-access-hq8dk\") pod \"service-ca-operator-777779d784-ds4fg\" (UID: \"5ebf1ddd-591b-408d-914e-eb316cabd08c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ds4fg" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.634669 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rksq8\" (UniqueName: \"kubernetes.io/projected/005bbea4-d7a5-413a-9c19-d9503a370566-kube-api-access-rksq8\") pod \"package-server-manager-789f6589d5-gjrbj\" (UID: \"005bbea4-d7a5-413a-9c19-d9503a370566\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gjrbj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.639149 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hndq5" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.640861 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xh4x\" (UniqueName: \"kubernetes.io/projected/72d5c706-f441-4f26-99b0-c8979fb0c3f3-kube-api-access-5xh4x\") pod \"csi-hostpathplugin-ljmrj\" (UID: \"72d5c706-f441-4f26-99b0-c8979fb0c3f3\") " pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.641296 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shpqx\" (UniqueName: \"kubernetes.io/projected/bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed-kube-api-access-shpqx\") pod \"marketplace-operator-79b997595-rjp5j\" (UID: \"bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed\") " pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.643916 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4xqm\" (UniqueName: \"kubernetes.io/projected/ccb55971-c2d7-440a-bb7a-dcc1d9c0b562-kube-api-access-h4xqm\") pod \"packageserver-d55dfcdfc-r94vn\" (UID: \"ccb55971-c2d7-440a-bb7a-dcc1d9c0b562\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.644819 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lt6w\" (UniqueName: \"kubernetes.io/projected/0f4e9ccb-4580-4ea7-a7bd-181bab6530c0-kube-api-access-9lt6w\") pod \"service-ca-9c57cc56f-6bp9m\" (UID: \"0f4e9ccb-4580-4ea7-a7bd-181bab6530c0\") " pod="openshift-service-ca/service-ca-9c57cc56f-6bp9m" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.645078 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gjrbj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.650879 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gt6v\" (UniqueName: \"kubernetes.io/projected/1c9fc77b-189a-4fbc-9449-491f7a1700b9-kube-api-access-2gt6v\") pod \"etcd-operator-b45778765-gnzm2\" (UID: \"1c9fc77b-189a-4fbc-9449-491f7a1700b9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.650882 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8bc7cfea-82a6-48a1-995a-b473d672a62d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jmjtn\" (UID: \"8bc7cfea-82a6-48a1-995a-b473d672a62d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jmjtn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.655038 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-htgvn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.659246 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nng2k\" (UniqueName: \"kubernetes.io/projected/ab96097e-4edb-4abf-a7b4-c05a20f86659-kube-api-access-nng2k\") pod \"olm-operator-6b444d44fb-lvb2q\" (UID: \"ab96097e-4edb-4abf-a7b4-c05a20f86659\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvb2q" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.660402 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btz2b\" (UniqueName: \"kubernetes.io/projected/e8c328d2-8ab9-446f-9cb0-ebc873785d90-kube-api-access-btz2b\") pod \"machine-config-server-92b88\" (UID: \"e8c328d2-8ab9-446f-9cb0-ebc873785d90\") " pod="openshift-machine-config-operator/machine-config-server-92b88" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.663544 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvb2q" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.670594 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ds4fg" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.671887 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fzp9w"] Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.677188 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: E1003 14:03:13.677597 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:14.177584027 +0000 UTC m=+144.036310284 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.682504 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dwjx\" (UniqueName: \"kubernetes.io/projected/8bc7cfea-82a6-48a1-995a-b473d672a62d-kube-api-access-6dwjx\") pod \"ingress-operator-5b745b69d9-jmjtn\" (UID: \"8bc7cfea-82a6-48a1-995a-b473d672a62d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jmjtn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.687424 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.697189 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.705130 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c9dcg"] Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.705438 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-44xt7"] Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.705612 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6bp9m" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.708872 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6qz8k"] Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.712177 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bgrwl" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.739129 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.747020 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-92b88" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.778330 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:13 crc kubenswrapper[4636]: E1003 14:03:13.778692 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:14.278674324 +0000 UTC m=+144.137400571 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.833366 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-j8z62"] Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.879325 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: E1003 14:03:13.879848 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:14.379829112 +0000 UTC m=+144.238555389 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.920092 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-srw4g"] Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.929362 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" Oct 03 14:03:13 crc kubenswrapper[4636]: W1003 14:03:13.944958 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf02b12d5_48d5_48ed_81c7_db4e06189afe.slice/crio-f3d0b98c676fcf2bc28ea859d55050caf80fd8c6b38b16695bd9205149875fdc WatchSource:0}: Error finding container f3d0b98c676fcf2bc28ea859d55050caf80fd8c6b38b16695bd9205149875fdc: Status 404 returned error can't find the container with id f3d0b98c676fcf2bc28ea859d55050caf80fd8c6b38b16695bd9205149875fdc Oct 03 14:03:13 crc kubenswrapper[4636]: W1003 14:03:13.947970 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cbc4a55_551d_4314_bcb0_751f82313dc0.slice/crio-c28cb5bd94c2eb9d98571708d04da06a1dacbcfaf74eee448903f47128960178 WatchSource:0}: Error finding container c28cb5bd94c2eb9d98571708d04da06a1dacbcfaf74eee448903f47128960178: Status 404 returned error can't find the container with id c28cb5bd94c2eb9d98571708d04da06a1dacbcfaf74eee448903f47128960178 Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.978799 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jmjtn" Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.979883 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:13 crc kubenswrapper[4636]: E1003 14:03:13.980125 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:14.480090298 +0000 UTC m=+144.338816545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:13 crc kubenswrapper[4636]: I1003 14:03:13.980208 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:13 crc kubenswrapper[4636]: E1003 14:03:13.980507 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:14.480492118 +0000 UTC m=+144.339218365 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:14 crc kubenswrapper[4636]: W1003 14:03:14.002677 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod010ef4ac_9542_4a76_a005_385439b1045c.slice/crio-4a6ead0a815cb8d4c734183f26d139625d9bca641df8471374c25a41fe039e01 WatchSource:0}: Error finding container 4a6ead0a815cb8d4c734183f26d139625d9bca641df8471374c25a41fe039e01: Status 404 returned error can't find the container with id 4a6ead0a815cb8d4c734183f26d139625d9bca641df8471374c25a41fe039e01 Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.082486 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:14 crc kubenswrapper[4636]: E1003 14:03:14.083250 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:14.583232597 +0000 UTC m=+144.441958854 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.183857 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:14 crc kubenswrapper[4636]: E1003 14:03:14.184182 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:14.684170489 +0000 UTC m=+144.542896736 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.194880 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r49hv"] Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.284578 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:14 crc kubenswrapper[4636]: E1003 14:03:14.284847 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:14.784828455 +0000 UTC m=+144.643554702 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.284912 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:14 crc kubenswrapper[4636]: E1003 14:03:14.285238 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:14.785228575 +0000 UTC m=+144.643954822 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.325765 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4"] Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.386646 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:14 crc kubenswrapper[4636]: E1003 14:03:14.387019 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:14.886998159 +0000 UTC m=+144.745724406 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.402402 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qzkgg"] Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.458337 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vzs54"] Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.488683 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:14 crc kubenswrapper[4636]: E1003 14:03:14.489052 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:14.98904054 +0000 UTC m=+144.847766787 (durationBeforeRetry 500ms). 
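Note the two distinct operation keys in the errors: the unmount is keyed "{volumeName:... podName:8f668bae-612b-4b75-9490-919e737c6a3b ...}" while the MountDevice is keyed with an empty podName, because staging a device is volume-scoped rather than pod-scoped. That keying is what lets a single volume carry both a pending per-pod teardown (for the old pod UID) and a pending device mount (for the new image-registry pod) at the same time, as seen here. A tiny illustrative sketch of that keying, not the kubelet's types:

// The (volumeName, podName) keying visible in the nestedpendingoperations
// errors: device-level operations use an empty podName, pod-level
// operations carry the pod UID, so both can be pending on one volume.
package main

import "fmt"

type opKey struct {
	volumeName string
	podName    string // empty for volume-scoped (device) operations
}

func main() {
	pending := map[opKey]string{}
	vol := "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8"
	pending[opKey{vol, "8f668bae-612b-4b75-9490-919e737c6a3b"}] = "UnmountVolume.TearDown"
	pending[opKey{vol, ""}] = "MountVolume.MountDevice"
	for k, op := range pending {
		fmt.Printf("%s pod=%q -> %s\n", op, k.podName, k.volumeName)
	}
}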
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.492421 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ds4fg"] Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.541294 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pb9vh" event={"ID":"5fee0926-3042-4015-ad02-90f4306431ae","Type":"ContainerStarted","Data":"94dfe32afddccb21186973d8f448cf41099c2a5cd2365b0d9e0eae02e78e4274"} Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.543193 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-j8z62" event={"ID":"8cbc4a55-551d-4314-bcb0-751f82313dc0","Type":"ContainerStarted","Data":"c28cb5bd94c2eb9d98571708d04da06a1dacbcfaf74eee448903f47128960178"} Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.544318 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c9dcg" event={"ID":"0aa8bab3-482e-41e8-800e-9962a4146194","Type":"ContainerStarted","Data":"1192f1e3f78b4755d124ee516452a57c510958fe04d62947e5af4325f89087bf"} Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.545461 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" event={"ID":"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b","Type":"ContainerStarted","Data":"75e5382e7ed4d45748ddd8a9db58f76e8ed5cea24b2cb82abd62f6938e89cf10"} Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.547167 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-877d7" event={"ID":"c83442ff-933c-4f99-aae8-522e4dc94199","Type":"ContainerStarted","Data":"6e9e173aa4613c2e1fc08b5b096b215a094b84b490d7be92885e8a40a06c2281"} Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.547888 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tw6g9" event={"ID":"f02b12d5-48d5-48ed-81c7-db4e06189afe","Type":"ContainerStarted","Data":"f3d0b98c676fcf2bc28ea859d55050caf80fd8c6b38b16695bd9205149875fdc"} Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.548556 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5q7j6" event={"ID":"40e1f28c-6d64-4fa1-b554-507ff389f115","Type":"ContainerStarted","Data":"50762f749be7f53dece2256824c26051e41610d0d29b03d55192a1850f7c65f4"} Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.549326 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qz8k" event={"ID":"7cf0aef3-c9be-4539-91bb-0a26d7d2a82e","Type":"ContainerStarted","Data":"21cf2789602773320984a4fba55a264c02aced660d66c3c31cd2adb31025fe9b"} Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.549988 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-44xt7" event={"ID":"8d6ca41b-f3b4-4bce-ad85-0150cfeb2362","Type":"ContainerStarted","Data":"56737fdaf5b9324cc2534bc74627a226a85e4faf14e96101ad4a802740916a47"} Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.551060 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-zvvrp" event={"ID":"ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76","Type":"ContainerStarted","Data":"0baa39ca5347a3946588954c4277e76bd2d3cc2429eb1e131984c5384710626c"} Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.551683 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-fzp9w" event={"ID":"55eb43d6-a42c-4b21-a8e9-82d2c75ee839","Type":"ContainerStarted","Data":"d779dc80db8681593dfb533e3ffa27c73cb294f35946d5c063d78b9f2d29a204"} Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.552821 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" event={"ID":"010ef4ac-9542-4a76-a005-385439b1045c","Type":"ContainerStarted","Data":"4a6ead0a815cb8d4c734183f26d139625d9bca641df8471374c25a41fe039e01"} Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.554691 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fqn9d" event={"ID":"9d6eefb2-0ce4-4501-b443-d06c45efd41f","Type":"ContainerStarted","Data":"7c17becf7997c31a5da0ae14faf662f9b3968d31616ddf248427fbe6f9acee2d"} Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.589498 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:14 crc kubenswrapper[4636]: E1003 14:03:14.589693 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:15.089664005 +0000 UTC m=+144.948390262 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.589947 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:14 crc kubenswrapper[4636]: E1003 14:03:14.590275 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-03 14:03:15.09026235 +0000 UTC m=+144.948988597 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.594593 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2p8qq"] Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.692236 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:14 crc kubenswrapper[4636]: E1003 14:03:14.692466 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:15.192444494 +0000 UTC m=+145.051170741 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.692408 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-54vkr"] Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.692803 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:14 crc kubenswrapper[4636]: E1003 14:03:14.693274 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:15.193262105 +0000 UTC m=+145.051988352 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.707632 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rsgnn"] Oct 03 14:03:14 crc kubenswrapper[4636]: W1003 14:03:14.709934 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0499c819_4b67_4882_9354_f7b9d6d2adc7.slice/crio-9ae95ca95a42bdbe880259d185a176401cb7138d3dd6bd63a6e790b1a9ce8430 WatchSource:0}: Error finding container 9ae95ca95a42bdbe880259d185a176401cb7138d3dd6bd63a6e790b1a9ce8430: Status 404 returned error can't find the container with id 9ae95ca95a42bdbe880259d185a176401cb7138d3dd6bd63a6e790b1a9ce8430 Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.797966 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:14 crc kubenswrapper[4636]: E1003 14:03:14.798522 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:15.298505208 +0000 UTC m=+145.157231455 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.845425 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bgrwl"] Oct 03 14:03:14 crc kubenswrapper[4636]: W1003 14:03:14.873371 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod308225d5_c374_4bb6_a967_020bf6e7173f.slice/crio-09e08701bac07f06c3cd86fdffffa836f3c92eeeedd733c4dfc8ee63c092be17 WatchSource:0}: Error finding container 09e08701bac07f06c3cd86fdffffa836f3c92eeeedd733c4dfc8ee63c092be17: Status 404 returned error can't find the container with id 09e08701bac07f06c3cd86fdffffa836f3c92eeeedd733c4dfc8ee63c092be17 Oct 03 14:03:14 crc kubenswrapper[4636]: W1003 14:03:14.876608 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0013e14d_2163_45f2_8a98_dbe6805e40d0.slice/crio-7424af4716b0e28a906511a3c6ef4cbc547785c7165d32cd77f71ec00b47f37c WatchSource:0}: Error finding container 7424af4716b0e28a906511a3c6ef4cbc547785c7165d32cd77f71ec00b47f37c: Status 404 returned error can't find the container with id 7424af4716b0e28a906511a3c6ef4cbc547785c7165d32cd77f71ec00b47f37c Oct 03 14:03:14 crc kubenswrapper[4636]: W1003 14:03:14.877678 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ebf1ddd_591b_408d_914e_eb316cabd08c.slice/crio-754f6bc4e8ba99b8fc88ee5d5bcc88eabd3522cdbd4f9a8b294570a952faf43b WatchSource:0}: Error finding container 754f6bc4e8ba99b8fc88ee5d5bcc88eabd3522cdbd4f9a8b294570a952faf43b: Status 404 returned error can't find the container with id 754f6bc4e8ba99b8fc88ee5d5bcc88eabd3522cdbd4f9a8b294570a952faf43b Oct 03 14:03:14 crc kubenswrapper[4636]: W1003 14:03:14.886528 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45b4ca9d_0850_4212_a6d4_9ff374d3e982.slice/crio-c6dac677b7194a33355b7cf6bb56efb81640b014fed896fd97244113e73ea41e WatchSource:0}: Error finding container c6dac677b7194a33355b7cf6bb56efb81640b014fed896fd97244113e73ea41e: Status 404 returned error can't find the container with id c6dac677b7194a33355b7cf6bb56efb81640b014fed896fd97244113e73ea41e Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.911984 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:14 crc kubenswrapper[4636]: E1003 14:03:14.912413 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-03 14:03:15.412401171 +0000 UTC m=+145.271127418 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:14 crc kubenswrapper[4636]: W1003 14:03:14.920964 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2154c02b_316a_4776_944f_734586e04489.slice/crio-4caa8ed752bbcc2986a2eb597d55d248499cb63cab7594a39bb0d7113dcc933a WatchSource:0}: Error finding container 4caa8ed752bbcc2986a2eb597d55d248499cb63cab7594a39bb0d7113dcc933a: Status 404 returned error can't find the container with id 4caa8ed752bbcc2986a2eb597d55d248499cb63cab7594a39bb0d7113dcc933a Oct 03 14:03:14 crc kubenswrapper[4636]: I1003 14:03:14.945147 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hndq5"] Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.012680 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:15 crc kubenswrapper[4636]: E1003 14:03:15.013163 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:15.513147228 +0000 UTC m=+145.371873475 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.113893 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:15 crc kubenswrapper[4636]: E1003 14:03:15.114212 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:15.614200614 +0000 UTC m=+145.472926861 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.217625 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:15 crc kubenswrapper[4636]: E1003 14:03:15.217981 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:15.717965309 +0000 UTC m=+145.576691556 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.218187 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:15 crc kubenswrapper[4636]: E1003 14:03:15.218492 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:15.718485502 +0000 UTC m=+145.577211749 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.319795 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:15 crc kubenswrapper[4636]: E1003 14:03:15.319983 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:15.819958778 +0000 UTC m=+145.678685025 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.320032 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:15 crc kubenswrapper[4636]: E1003 14:03:15.320482 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:15.820475332 +0000 UTC m=+145.679201579 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:15 crc kubenswrapper[4636]: W1003 14:03:15.356794 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8c328d2_8ab9_446f_9cb0_ebc873785d90.slice/crio-5db59f9a06e026a737a2003b369e1d1dea61d79a7f448d05f5321a8941d08bd9 WatchSource:0}: Error finding container 5db59f9a06e026a737a2003b369e1d1dea61d79a7f448d05f5321a8941d08bd9: Status 404 returned error can't find the container with id 5db59f9a06e026a737a2003b369e1d1dea61d79a7f448d05f5321a8941d08bd9 Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.420717 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:15 crc kubenswrapper[4636]: E1003 14:03:15.420887 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:15.92085564 +0000 UTC m=+145.779581897 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.420934 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:15 crc kubenswrapper[4636]: E1003 14:03:15.421517 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:15.921507707 +0000 UTC m=+145.780233954 (durationBeforeRetry 500ms). 
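The W-level manager.go:1169 messages interleaved here come from cadvisor losing a race: a cgroup watch event names a container directory that CRI-O has already removed or not yet fully populated, so the follow-up lookup returns 404. During a burst of pod startups like this one, these warnings are expected noise; the usual handling is to treat "not found" as a skip. An illustrative sketch of that pattern follows; the path and handling are assumptions about the general technique, not cadvisor's code.

// The race the warnings describe: by the time an event's cgroup path is
// inspected, the container may already be gone. "Does not exist" is a
// skip, not an error.
package main

import (
	"errors"
	"fmt"
	"io/fs"
	"os"
)

func handleWatchEvent(cgroupPath string) error {
	if _, err := os.Stat(cgroupPath); errors.Is(err, fs.ErrNotExist) {
		return nil // container already torn down (or not yet created): drop the event
	} else if err != nil {
		return err // a real failure, worth surfacing
	}
	fmt.Println("container still present, would start watching:", cgroupPath)
	return nil
}

func main() {
	err := handleWatchEvent("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/crio-deadbeef.scope")
	fmt.Println("event handled, err =", err)
}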
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.481784 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-htgvn"] Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.483013 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ljmrj"] Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.524031 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:15 crc kubenswrapper[4636]: E1003 14:03:15.524167 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:16.024133012 +0000 UTC m=+145.882859259 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.524407 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:15 crc kubenswrapper[4636]: E1003 14:03:15.524666 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:16.024659496 +0000 UTC m=+145.883385743 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.563267 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jmjtn"] Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.564064 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rsgnn" event={"ID":"0013e14d-2163-45f2-8a98-dbe6805e40d0","Type":"ContainerStarted","Data":"7424af4716b0e28a906511a3c6ef4cbc547785c7165d32cd77f71ec00b47f37c"} Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.566608 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bgrwl" event={"ID":"2154c02b-316a-4776-944f-734586e04489","Type":"ContainerStarted","Data":"4caa8ed752bbcc2986a2eb597d55d248499cb63cab7594a39bb0d7113dcc933a"} Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.571805 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" event={"ID":"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264","Type":"ContainerStarted","Data":"5d8914ebcbbf9d784938c73d00efd8608616db81e3673d422ee5593daa2ae7f8"} Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.572950 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-54vkr" event={"ID":"45b4ca9d-0850-4212-a6d4-9ff374d3e982","Type":"ContainerStarted","Data":"c6dac677b7194a33355b7cf6bb56efb81640b014fed896fd97244113e73ea41e"} Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.575682 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hndq5" event={"ID":"ff63da45-76c4-4b40-a65b-177b4bfa9feb","Type":"ContainerStarted","Data":"54802790f7c046f92f7d17f9f80907fddbf89e031891058c95a82b467eb9f90c"} Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.577591 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r49hv" event={"ID":"0499c819-4b67-4882-9354-f7b9d6d2adc7","Type":"ContainerStarted","Data":"9ae95ca95a42bdbe880259d185a176401cb7138d3dd6bd63a6e790b1a9ce8430"} Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.578056 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6bp9m"] Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.578638 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-92b88" event={"ID":"e8c328d2-8ab9-446f-9cb0-ebc873785d90","Type":"ContainerStarted","Data":"5db59f9a06e026a737a2003b369e1d1dea61d79a7f448d05f5321a8941d08bd9"} Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.579495 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qzkgg" event={"ID":"e697897f-0594-48da-967d-e429421b8fec","Type":"ContainerStarted","Data":"27999d0a381f4c5474e9c066689268adf861f85a25de578149d0fd8858aa6e98"} Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 
14:03:15.580346 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" event={"ID":"8c513f61-cee7-451f-b8a9-1dab425641a8","Type":"ContainerStarted","Data":"415a0d2a388e77176ff971efc40a15cfc0cd2cca6420979fb85425d2d981e630"} Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.581046 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-vzs54" event={"ID":"308225d5-c374-4bb6-a967-020bf6e7173f","Type":"ContainerStarted","Data":"09e08701bac07f06c3cd86fdffffa836f3c92eeeedd733c4dfc8ee63c092be17"} Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.581117 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rjp5j"] Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.582033 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ds4fg" event={"ID":"5ebf1ddd-591b-408d-914e-eb316cabd08c","Type":"ContainerStarted","Data":"754f6bc4e8ba99b8fc88ee5d5bcc88eabd3522cdbd4f9a8b294570a952faf43b"} Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.627555 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:15 crc kubenswrapper[4636]: E1003 14:03:15.627974 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:16.127957998 +0000 UTC m=+145.986684245 (durationBeforeRetry 500ms). 
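The "SyncLoop (PLEG): event for pod" records above are the pod lifecycle event generator turning container-runtime state changes (here, freshly created sandboxes reported as ContainerStarted with their new container IDs) into events the kubelet sync loop consumes. A minimal sketch of that plumbing follows, with field names echoing the log output rather than the kubelet's actual types.

// PLEG-style plumbing: runtime state changes become typed events consumed
// by a sync loop. Values below are taken from this log; the types are
// illustrative only.
package main

import "fmt"

type PodLifecycleEvent struct {
	ID   string // pod UID
	Type string // e.g. "ContainerStarted"
	Data string // container ID
}

func main() {
	events := make(chan PodLifecycleEvent, 1)
	events <- PodLifecycleEvent{
		ID:   "0013e14d-2163-45f2-8a98-dbe6805e40d0",
		Type: "ContainerStarted",
		Data: "7424af4716b0e28a906511a3c6ef4cbc547785c7165d32cd77f71ec00b47f37c",
	}
	close(events)
	for ev := range events {
		fmt.Printf("SyncLoop (PLEG): pod=%s %s %s\n", ev.ID, ev.Type, ev.Data)
	}
}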
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.639026 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gjrbj"] Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.640709 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-gnzm2"] Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.646012 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvb2q"] Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.649867 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn"] Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.728870 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:15 crc kubenswrapper[4636]: E1003 14:03:15.729308 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:16.22929075 +0000 UTC m=+146.088016997 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.829531 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:15 crc kubenswrapper[4636]: E1003 14:03:15.829722 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:16.32969595 +0000 UTC m=+146.188422197 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.830464 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:15 crc kubenswrapper[4636]: E1003 14:03:15.830877 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:16.33086896 +0000 UTC m=+146.189595207 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:15 crc kubenswrapper[4636]: I1003 14:03:15.931692 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:15 crc kubenswrapper[4636]: E1003 14:03:15.931990 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:16.431952906 +0000 UTC m=+146.290679193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.032926 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:16 crc kubenswrapper[4636]: E1003 14:03:16.033192 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:16.533182836 +0000 UTC m=+146.391909083 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:16 crc kubenswrapper[4636]: W1003 14:03:16.045404 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72d5c706_f441_4f26_99b0_c8979fb0c3f3.slice/crio-3fa8235812eeab878eba91d574bfa6af823da727545e24634652d1582170b142 WatchSource:0}: Error finding container 3fa8235812eeab878eba91d574bfa6af823da727545e24634652d1582170b142: Status 404 returned error can't find the container with id 3fa8235812eeab878eba91d574bfa6af823da727545e24634652d1582170b142 Oct 03 14:03:16 crc kubenswrapper[4636]: W1003 14:03:16.057874 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bc7cfea_82a6_48a1_995a_b473d672a62d.slice/crio-e60ed334a26d39d9d78955b9e822a5791bbd5d09ee13e5607ac3d235b120d825 WatchSource:0}: Error finding container e60ed334a26d39d9d78955b9e822a5791bbd5d09ee13e5607ac3d235b120d825: Status 404 returned error can't find the container with id e60ed334a26d39d9d78955b9e822a5791bbd5d09ee13e5607ac3d235b120d825 Oct 03 14:03:16 crc kubenswrapper[4636]: W1003 14:03:16.059417 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf1dc0fc_7cd7_46c2_8d8f_ae889ce93aed.slice/crio-769365e1af4a59f983c2e0c3faf59ba99cc52b5b7dc891bd343302f93b44e050 WatchSource:0}: Error finding container 769365e1af4a59f983c2e0c3faf59ba99cc52b5b7dc891bd343302f93b44e050: Status 404 returned error can't find the container with id 769365e1af4a59f983c2e0c3faf59ba99cc52b5b7dc891bd343302f93b44e050 Oct 03 14:03:16 crc kubenswrapper[4636]: W1003 14:03:16.103462 4636 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab96097e_4edb_4abf_a7b4_c05a20f86659.slice/crio-e38f43593713da26f843eef91c9b2b6a0c7e669f3c3525ac76902c76e7c2a731 WatchSource:0}: Error finding container e38f43593713da26f843eef91c9b2b6a0c7e669f3c3525ac76902c76e7c2a731: Status 404 returned error can't find the container with id e38f43593713da26f843eef91c9b2b6a0c7e669f3c3525ac76902c76e7c2a731 Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.134166 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:16 crc kubenswrapper[4636]: E1003 14:03:16.134529 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:16.634512769 +0000 UTC m=+146.493239016 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.235896 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:16 crc kubenswrapper[4636]: E1003 14:03:16.236494 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:16.736478518 +0000 UTC m=+146.595204765 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.337220 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:16 crc kubenswrapper[4636]: E1003 14:03:16.339658 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:16.839636377 +0000 UTC m=+146.698362644 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.439907 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:16 crc kubenswrapper[4636]: E1003 14:03:16.440511 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:16.940498388 +0000 UTC m=+146.799224635 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.541795 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:16 crc kubenswrapper[4636]: E1003 14:03:16.542674 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:17.042659872 +0000 UTC m=+146.901386119 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.629437 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qfbfg" event={"ID":"13703c39-6eda-487f-9d53-509c6042d515","Type":"ContainerStarted","Data":"e531c229435a819a58c8c325c49aa485284b0a6bc5eddcd3928d57350b523005"}
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.648415 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:16 crc kubenswrapper[4636]: E1003 14:03:16.648712 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:17.148700575 +0000 UTC m=+147.007426822 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.659406 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fcxkp" event={"ID":"3a0e6228-8a56-4b62-87f1-24eec9cffdd5","Type":"ContainerStarted","Data":"dbaa6d0f0b73f267b8c66ed6c9a6d235f24988fe78a8b65f708b98ac9ad3509b"}
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.668758 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-htgvn" event={"ID":"29b5d949-e3d0-4f7b-9f11-5a37beb5ead2","Type":"ContainerStarted","Data":"2f5e715a1b197dae1f70b204362133a3acc53026d7afe9632d6cdde472f5c342"}
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.680661 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lqxss" event={"ID":"f6977d44-d8ff-4d40-959f-024da50c53fe","Type":"ContainerStarted","Data":"f6ec9cd648c475fd8dc49dbf775e20316cb6a87cadad0f41a4bf942ab0c24746"}
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.694365 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn" event={"ID":"ccb55971-c2d7-440a-bb7a-dcc1d9c0b562","Type":"ContainerStarted","Data":"ef098419d1e95007d7db90312d53574fde53f1cc60783579e4f5800936f77b5b"}
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.710706 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7fqgj" event={"ID":"1ace060b-1d1d-4704-9655-78ab3599db2b","Type":"ContainerStarted","Data":"bfada01eb6406614fe81cb2a634a81f094502e59c11eed28fd6a0a5de00b99e5"}
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.726902 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tpww2" event={"ID":"6e97026a-c5d6-4767-8ff5-54fad15f7a49","Type":"ContainerStarted","Data":"7dbe1dcd36bf729c40baaf6c486f90c7ee7c0b698ee87f90b90fa44165c910f9"}
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.729057 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j" event={"ID":"bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed","Type":"ContainerStarted","Data":"769365e1af4a59f983c2e0c3faf59ba99cc52b5b7dc891bd343302f93b44e050"}
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.730698 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-44xt7" event={"ID":"8d6ca41b-f3b4-4bce-ad85-0150cfeb2362","Type":"ContainerStarted","Data":"42e80a15e6c013e6cf7318d0186d1ca135c29b513902a45decb7ce600bf324f2"}
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.733999 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-44xt7"
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.738400 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-lqxss" podStartSLOduration=124.738381061 podStartE2EDuration="2m4.738381061s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:16.735687092 +0000 UTC m=+146.594413359" watchObservedRunningTime="2025-10-03 14:03:16.738381061 +0000 UTC m=+146.597107318"
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.738889 4636 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-44xt7 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body=
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.738929 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-44xt7" podUID="8d6ca41b-f3b4-4bce-ad85-0150cfeb2362" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused"
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.747448 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf" event={"ID":"64802881-57b2-4263-b5d8-f3c4c224c692","Type":"ContainerStarted","Data":"cf184242456787a73f2999d28c4eb1472742241b8b3a158f2e7c20f76ead3285"}
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.748781 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:16 crc kubenswrapper[4636]: E1003 14:03:16.748939 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:17.248914879 +0000 UTC m=+147.107641126 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.749093 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:16 crc kubenswrapper[4636]: E1003 14:03:16.749428 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:17.249420732 +0000 UTC m=+147.108146969 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.751946 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9gjr4" event={"ID":"f83d7f1b-5df1-40e0-ac89-3cfca1fd0b23","Type":"ContainerStarted","Data":"a678a0ad7336c84b9ca4af8085caf8dad0b172a4dbc5d219f4171a0f6902b020"}
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.756773 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvb2q" event={"ID":"ab96097e-4edb-4abf-a7b4-c05a20f86659","Type":"ContainerStarted","Data":"e38f43593713da26f843eef91c9b2b6a0c7e669f3c3525ac76902c76e7c2a731"}
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.766463 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tpww2" podStartSLOduration=124.766438936 podStartE2EDuration="2m4.766438936s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:16.761706525 +0000 UTC m=+146.620432772" watchObservedRunningTime="2025-10-03 14:03:16.766438936 +0000 UTC m=+146.625165183"
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.771254 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5q7j6" event={"ID":"40e1f28c-6d64-4fa1-b554-507ff389f115","Type":"ContainerStarted","Data":"980ddaf169e186d1951a2c48575a87579bb271831e49690ac775f61524d368c9"}
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.772204 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-5q7j6"
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.773302 4636 patch_prober.go:28] interesting pod/downloads-7954f5f757-5q7j6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.773345 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5q7j6" podUID="40e1f28c-6d64-4fa1-b554-507ff389f115" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.785434 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7fqgj" podStartSLOduration=124.785413769 podStartE2EDuration="2m4.785413769s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:16.783188593 +0000 UTC m=+146.641914850" watchObservedRunningTime="2025-10-03 14:03:16.785413769 +0000 UTC m=+146.644140016"
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.786919 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6bp9m" event={"ID":"0f4e9ccb-4580-4ea7-a7bd-181bab6530c0","Type":"ContainerStarted","Data":"86ef5a0aa4d2c2fa73d6c4f06a1368d314cddd92031ebf5a06df8c3a8ffe8eaa"}
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.834695 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-44xt7" podStartSLOduration=123.834674875 podStartE2EDuration="2m3.834674875s" podCreationTimestamp="2025-10-03 14:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:16.833337081 +0000 UTC m=+146.692063328" watchObservedRunningTime="2025-10-03 14:03:16.834674875 +0000 UTC m=+146.693401122"
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.850302 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-j8z62" event={"ID":"8cbc4a55-551d-4314-bcb0-751f82313dc0","Type":"ContainerStarted","Data":"944406a3ef658d01289b61b2f23130ea76ef337d73e96828be3c760c9120795e"}
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.851466 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:16 crc kubenswrapper[4636]: E1003 14:03:16.852716 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:17.352697284 +0000 UTC m=+147.211423531 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.861839 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-5q7j6" podStartSLOduration=124.861822227 podStartE2EDuration="2m4.861822227s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:16.859674022 +0000 UTC m=+146.718400289" watchObservedRunningTime="2025-10-03 14:03:16.861822227 +0000 UTC m=+146.720548474"
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.897312 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" event={"ID":"72d5c706-f441-4f26-99b0-c8979fb0c3f3","Type":"ContainerStarted","Data":"3fa8235812eeab878eba91d574bfa6af823da727545e24634652d1582170b142"}
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.898367 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf" podStartSLOduration=124.898350878 podStartE2EDuration="2m4.898350878s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:16.89570495 +0000 UTC m=+146.754431197" watchObservedRunningTime="2025-10-03 14:03:16.898350878 +0000 UTC m=+146.757077125"
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.935160 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9gjr4" podStartSLOduration=124.935140816 podStartE2EDuration="2m4.935140816s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:16.934033327 +0000 UTC m=+146.792759584" watchObservedRunningTime="2025-10-03 14:03:16.935140816 +0000 UTC m=+146.793867073"
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.946369 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r49hv" event={"ID":"0499c819-4b67-4882-9354-f7b9d6d2adc7","Type":"ContainerStarted","Data":"544ddb228e13449284a16cd0becff0f5a6750c2dde41d3f6c8047593e079de46"}
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.957720 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:16 crc kubenswrapper[4636]: E1003 14:03:16.959321 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:17.459303541 +0000 UTC m=+147.318029878 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.967449 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-j8z62" podStartSLOduration=124.967433859 podStartE2EDuration="2m4.967433859s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:16.962607576 +0000 UTC m=+146.821333823" watchObservedRunningTime="2025-10-03 14:03:16.967433859 +0000 UTC m=+146.826160106"
Oct 03 14:03:16 crc kubenswrapper[4636]: I1003 14:03:16.969273 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gjrbj" event={"ID":"005bbea4-d7a5-413a-9c19-d9503a370566","Type":"ContainerStarted","Data":"2b433bf058692c329d4b3ce83945bad6332e1c76c46f7f7f22586dffbda209c8"}
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.030049 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-fzp9w" event={"ID":"55eb43d6-a42c-4b21-a8e9-82d2c75ee839","Type":"ContainerStarted","Data":"733f09668ffae29ac58ca0bf966d3244d1665d218f87b771a46a2392b7ea6cf6"}
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.030551 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-fzp9w"
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.031878 4636 patch_prober.go:28] interesting pod/console-operator-58897d9998-fzp9w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body=
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.031982 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-fzp9w" podUID="55eb43d6-a42c-4b21-a8e9-82d2c75ee839" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused"
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.054369 4636 generic.go:334] "Generic (PLEG): container finished" podID="7cf0aef3-c9be-4539-91bb-0a26d7d2a82e" containerID="da3b36f92d57c5974546276e2f075af31be5a83f19ac1c9f69e658d8a8c9f5c4" exitCode=0
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.054461 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qz8k" event={"ID":"7cf0aef3-c9be-4539-91bb-0a26d7d2a82e","Type":"ContainerDied","Data":"da3b36f92d57c5974546276e2f075af31be5a83f19ac1c9f69e658d8a8c9f5c4"}
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.070557 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:17 crc kubenswrapper[4636]: E1003 14:03:17.073035 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:17.57301597 +0000 UTC m=+147.431742217 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.091464 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-fzp9w" podStartSLOduration=125.09144602 podStartE2EDuration="2m5.09144602s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:17.08714302 +0000 UTC m=+146.945869277" watchObservedRunningTime="2025-10-03 14:03:17.09144602 +0000 UTC m=+146.950172267"
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.091953 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r49hv" podStartSLOduration=124.091945462 podStartE2EDuration="2m4.091945462s" podCreationTimestamp="2025-10-03 14:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:16.990524377 +0000 UTC m=+146.849250644" watchObservedRunningTime="2025-10-03 14:03:17.091945462 +0000 UTC m=+146.950671699"
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.119599 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tw6g9" event={"ID":"f02b12d5-48d5-48ed-81c7-db4e06189afe","Type":"ContainerStarted","Data":"d8f1567883d299944d53f0a929bbfea2cbb51597e86ea5a7c945d1f97bcc2d57"}
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.129164 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" event={"ID":"1c9fc77b-189a-4fbc-9449-491f7a1700b9","Type":"ContainerStarted","Data":"6266e74968680458ea8a7c2cd93334e947c5668daa51e845dfcbae024318affc"}
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.145446 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jmjtn" event={"ID":"8bc7cfea-82a6-48a1-995a-b473d672a62d","Type":"ContainerStarted","Data":"e60ed334a26d39d9d78955b9e822a5791bbd5d09ee13e5607ac3d235b120d825"}
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.173923 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:17 crc kubenswrapper[4636]: E1003 14:03:17.175731 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:17.675719118 +0000 UTC m=+147.534445365 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.177016 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tw6g9" podStartSLOduration=125.17699463 podStartE2EDuration="2m5.17699463s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:17.156849197 +0000 UTC m=+147.015575464" watchObservedRunningTime="2025-10-03 14:03:17.17699463 +0000 UTC m=+147.035720867"
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.198859 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-877d7" event={"ID":"c83442ff-933c-4f99-aae8-522e4dc94199","Type":"ContainerStarted","Data":"bbb904fb9f7270b1f5ce8dbb0028062a4f6ca5f67f77ef9901b0648945c60cdb"}
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.222946 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" event={"ID":"6b8981cc-75fe-4ecd-971f-f01c74e8fd74","Type":"ContainerStarted","Data":"3590e607bb939f0376f5f7907e81903fece6dc0ceaa6146bebd6e56508b19c14"}
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.223194 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd"
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.226223 4636 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-mz2wd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body=
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.226282 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" podUID="6b8981cc-75fe-4ecd-971f-f01c74e8fd74" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused"
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.240172 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pb9vh" event={"ID":"5fee0926-3042-4015-ad02-90f4306431ae","Type":"ContainerStarted","Data":"c52e1715e51449daca2f5274cd4342ceed71b77d14f2188ca0c40f668054d2a0"}
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.280627 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:17 crc kubenswrapper[4636]: E1003 14:03:17.281386 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:17.78135952 +0000 UTC m=+147.640085767 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.285284 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" podStartSLOduration=125.28526045 podStartE2EDuration="2m5.28526045s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:17.273082529 +0000 UTC m=+147.131808786" watchObservedRunningTime="2025-10-03 14:03:17.28526045 +0000 UTC m=+147.143986697"
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.369429 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-zvvrp" podStartSLOduration=125.369414104 podStartE2EDuration="2m5.369414104s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:17.313439958 +0000 UTC m=+147.172166205" watchObservedRunningTime="2025-10-03 14:03:17.369414104 +0000 UTC m=+147.228140351"
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.372420 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-fqn9d" podStartSLOduration=125.372413111 podStartE2EDuration="2m5.372413111s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:17.369504987 +0000 UTC m=+147.228231234" watchObservedRunningTime="2025-10-03 14:03:17.372413111 +0000 UTC m=+147.231139358"
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.385495 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:17 crc kubenswrapper[4636]: E1003 14:03:17.387651 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:17.887636739 +0000 UTC m=+147.746362986 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.487477 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:17 crc kubenswrapper[4636]: E1003 14:03:17.488236 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:17.988216032 +0000 UTC m=+147.846942279 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.488333 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:17 crc kubenswrapper[4636]: E1003 14:03:17.488948 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:17.988934911 +0000 UTC m=+147.847661158 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.589507 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:17 crc kubenswrapper[4636]: E1003 14:03:17.589675 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:18.089654278 +0000 UTC m=+147.948380525 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.590022 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:17 crc kubenswrapper[4636]: E1003 14:03:17.590344 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:18.090332505 +0000 UTC m=+147.949058752 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.691549 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:17 crc kubenswrapper[4636]: E1003 14:03:17.691758 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:18.19173346 +0000 UTC m=+148.050459707 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.691896 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:17 crc kubenswrapper[4636]: E1003 14:03:17.692313 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:18.192300954 +0000 UTC m=+148.051027271 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.792554 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:17 crc kubenswrapper[4636]: E1003 14:03:17.792936 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:18.292917049 +0000 UTC m=+148.151643296 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.893792 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:17 crc kubenswrapper[4636]: E1003 14:03:17.894463 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:18.394446836 +0000 UTC m=+148.253173083 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.902140 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-zvvrp"
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.905926 4636 patch_prober.go:28] interesting pod/router-default-5444994796-zvvrp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 14:03:17 crc kubenswrapper[4636]: [-]has-synced failed: reason withheld
Oct 03 14:03:17 crc kubenswrapper[4636]: [+]process-running ok
Oct 03 14:03:17 crc kubenswrapper[4636]: healthz check failed
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.906176 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zvvrp" podUID="ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 14:03:17 crc kubenswrapper[4636]: I1003 14:03:17.994636 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:17 crc kubenswrapper[4636]: E1003 14:03:17.995631 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:18.495616845 +0000 UTC m=+148.354343092 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.096517 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:18 crc kubenswrapper[4636]: E1003 14:03:18.097429 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:18.59741422 +0000 UTC m=+148.456140477 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.198317 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:18 crc kubenswrapper[4636]: E1003 14:03:18.198669 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:18.69864174 +0000 UTC m=+148.557368027 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.246968 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-54vkr" event={"ID":"45b4ca9d-0850-4212-a6d4-9ff374d3e982","Type":"ContainerStarted","Data":"6e4f9783954e05f93be9b105e5d83bd9b360d55a22ef2ba40de217033fa3d123"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.249896 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fcxkp" event={"ID":"3a0e6228-8a56-4b62-87f1-24eec9cffdd5","Type":"ContainerStarted","Data":"ce973340ef4647786e5e45d8f9840ef68580e2a95a432b21ad96a84e2639e9a2"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.251644 4636 generic.go:334] "Generic (PLEG): container finished" podID="62f6cc0c-eb9d-44ef-8ce7-93a6148c3264" containerID="467e39b80a0ba32a621efc4362603ba421bb98cdf028b0b9fa4abec892c572cb" exitCode=0
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.251725 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" event={"ID":"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264","Type":"ContainerDied","Data":"467e39b80a0ba32a621efc4362603ba421bb98cdf028b0b9fa4abec892c572cb"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.254307 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" event={"ID":"1c9fc77b-189a-4fbc-9449-491f7a1700b9","Type":"ContainerStarted","Data":"05b29a2743a1bccd7c91140d5ecf998cfa165897a33c318ccf43ef5ed1b20e2a"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.257326 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j" event={"ID":"bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed","Type":"ContainerStarted","Data":"aadee51d005ae8bf2c42a2dd7725e0294b6f25c145eedff34843e2748e4177de"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.257573 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j"
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.258780 4636 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rjp5j container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body=
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.258899 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j" podUID="bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused"
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.262504 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6bp9m" event={"ID":"0f4e9ccb-4580-4ea7-a7bd-181bab6530c0","Type":"ContainerStarted","Data":"d952f3e379b653296d3ea85165c81f224959908d7f34b3d74d492f3677446a46"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.265612 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pb9vh" event={"ID":"5fee0926-3042-4015-ad02-90f4306431ae","Type":"ContainerStarted","Data":"2ba5eb1d76a45b5ca0887454f990cca64ad33b0a85a6416eaaae3db5b379a8df"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.275788 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-htgvn" event={"ID":"29b5d949-e3d0-4f7b-9f11-5a37beb5ead2","Type":"ContainerStarted","Data":"9a921e892690bfc197d8310322f8e12eb7fb79fdb2cd9ff01324dacf0a147518"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.276042 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-htgvn" event={"ID":"29b5d949-e3d0-4f7b-9f11-5a37beb5ead2","Type":"ContainerStarted","Data":"f7c74a49195a244323ef2a61fddccaa07ff7af26b5ca162b5996d19ca71c1f02"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.281468 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bgrwl" event={"ID":"2154c02b-316a-4776-944f-734586e04489","Type":"ContainerStarted","Data":"c36827c4d73fba711b82ee58c89edf9b0cd350122d538bc628d1f3fc9a5bb2ef"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.281532 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bgrwl" event={"ID":"2154c02b-316a-4776-944f-734586e04489","Type":"ContainerStarted","Data":"2e6cd2014bb83330271b9adc26b698215ac840bf378d0174bc17f5bbdac0d7cd"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.281702 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-bgrwl"
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.286684 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-54vkr" podStartSLOduration=8.286664603 podStartE2EDuration="8.286664603s" podCreationTimestamp="2025-10-03 14:03:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:18.284689023 +0000 UTC m=+148.143415270" watchObservedRunningTime="2025-10-03 14:03:18.286664603 +0000 UTC m=+148.145390850"
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.292610 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-877d7" event={"ID":"c83442ff-933c-4f99-aae8-522e4dc94199","Type":"ContainerStarted","Data":"5839fb88e2379c13f2d41e50edf7762e1cdc09dadfe611e64d18d6c6c5e46f53"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.299593 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:18 crc kubenswrapper[4636]: E1003 14:03:18.300051 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:18.800035524 +0000 UTC m=+148.658761771 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.300768 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" event={"ID":"010ef4ac-9542-4a76-a005-385439b1045c","Type":"ContainerStarted","Data":"43bdcb2f8b7da8f7ea02fda5c514156bd0e582963636d28f5fab7f7593fa075f"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.301747 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-srw4g"
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.302953 4636 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-srw4g container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.34:6443/healthz\": dial tcp 10.217.0.34:6443: connect: connection refused" start-of-body=
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.303271 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" podUID="010ef4ac-9542-4a76-a005-385439b1045c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.34:6443/healthz\": dial tcp 10.217.0.34:6443: connect: connection refused"
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.303795 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-92b88" event={"ID":"e8c328d2-8ab9-446f-9cb0-ebc873785d90","Type":"ContainerStarted","Data":"601b08d0d7f650e0e4c14917023694657e6ece5c09f4bc198c1d0a059f247b73"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.312121 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hndq5" event={"ID":"ff63da45-76c4-4b40-a65b-177b4bfa9feb","Type":"ContainerStarted","Data":"ad8e9e33fefd66872d93ae7bd3b2addd9729db2df2926e7a02037e19b79e1129"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.312256 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hndq5" event={"ID":"ff63da45-76c4-4b40-a65b-177b4bfa9feb","Type":"ContainerStarted","Data":"6b10e002f1c4029ae3d2080649ca3a283723f1cac4eab41224e2986dcf923f8b"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.321446 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvb2q" event={"ID":"ab96097e-4edb-4abf-a7b4-c05a20f86659","Type":"ContainerStarted","Data":"b86f8c1d726f1718b2c3a15c9840b258b073efef23158ac3abf56508be36da3f"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.322492 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvb2q"
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.323752 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-gnzm2" podStartSLOduration=126.323733478 podStartE2EDuration="2m6.323733478s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:18.320963518 +0000 UTC m=+148.179689765" watchObservedRunningTime="2025-10-03 14:03:18.323733478 +0000 UTC m=+148.182459725"
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.329214 4636 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lvb2q container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body=
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.329293 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvb2q" podUID="ab96097e-4edb-4abf-a7b4-c05a20f86659" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused"
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.334275 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn" event={"ID":"ccb55971-c2d7-440a-bb7a-dcc1d9c0b562","Type":"ContainerStarted","Data":"788cf3817450ca9abea68f588cfae715423a15aff91354424dbf6a1bcac716ee"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.335136 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn"
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.345453 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c9dcg" event={"ID":"0aa8bab3-482e-41e8-800e-9962a4146194","Type":"ContainerStarted","Data":"3ac62a31e8666f162e6a8833b5f61e0933512ca01c9c8b6386250f083fee6ade"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.345528 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c9dcg" event={"ID":"0aa8bab3-482e-41e8-800e-9962a4146194","Type":"ContainerStarted","Data":"820a25c90f2aa0f7b645ecf448055a9baf13abd40b7d5d4720cf89f839746c44"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.345873 4636 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-r94vn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body=
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.345914 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn" podUID="ccb55971-c2d7-440a-bb7a-dcc1d9c0b562" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused"
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.348899 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qz8k" event={"ID":"7cf0aef3-c9be-4539-91bb-0a26d7d2a82e","Type":"ContainerStarted","Data":"7876580c82aec2800ad3681f5bb92bd4cb787f5b37c160599ed7aa50b3186322"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.349091 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qz8k"
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.351821 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qfbfg" event={"ID":"13703c39-6eda-487f-9d53-509c6042d515","Type":"ContainerStarted","Data":"cef711576f53a4e4a9e9d8bc4303fffb1fdfa29e79429f1e6c2b8e7213fde57a"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.354080 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" event={"ID":"8c513f61-cee7-451f-b8a9-1dab425641a8","Type":"ContainerStarted","Data":"7191bd6becaabd21136e7d6a55aa2e68dc1b4cdf91a4e59ae500408128f0522d"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.354336 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4"
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.355717 4636 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-pc4j4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.355755 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" podUID="8c513f61-cee7-451f-b8a9-1dab425641a8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.371031 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ds4fg" event={"ID":"5ebf1ddd-591b-408d-914e-eb316cabd08c","Type":"ContainerStarted","Data":"67516aa46417fe2927c0c5aa71598420940b017eff1389d75dfc0132fb24d209"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.372735 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qzkgg" event={"ID":"e697897f-0594-48da-967d-e429421b8fec","Type":"ContainerStarted","Data":"5774f981452c394c3c56c94345deda4c39e50f76e58f9e4cf1862e8b9c347bf6"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.372754 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qzkgg" event={"ID":"e697897f-0594-48da-967d-e429421b8fec","Type":"ContainerStarted","Data":"6c1a18745ca8bcf38b6c202783f40b1c60c0833af63fb5f38420208b49c25031"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.374763 4636 generic.go:334] "Generic (PLEG): container finished" podID="0777cf5e-3bec-495f-8e8c-5d25b7a7b46b" containerID="0439be410b7a5527d31c27a4d4d3532459b3d6bd9181fa7437a2bfb18da2f7e1" exitCode=0
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.374804 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" event={"ID":"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b","Type":"ContainerDied","Data":"0439be410b7a5527d31c27a4d4d3532459b3d6bd9181fa7437a2bfb18da2f7e1"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.382350 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-vzs54" event={"ID":"308225d5-c374-4bb6-a967-020bf6e7173f","Type":"ContainerStarted","Data":"3f9afe534848bbb483b2b38b1fef363abe5b903800fb289d82e24c174ac48220"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.385112 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rsgnn" event={"ID":"0013e14d-2163-45f2-8a98-dbe6805e40d0","Type":"ContainerStarted","Data":"9abc448494f608da9206ce940a2197cba892d86f4a3ccd89341f332e68a89079"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.389696 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-6bp9m" podStartSLOduration=125.389665799 podStartE2EDuration="2m5.389665799s" podCreationTimestamp="2025-10-03 14:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:18.362515467 +0000 UTC m=+148.221241714" watchObservedRunningTime="2025-10-03 14:03:18.389665799 +0000 UTC m=+148.248392046"
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.393122 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gjrbj" event={"ID":"005bbea4-d7a5-413a-9c19-d9503a370566","Type":"ContainerStarted","Data":"0e41f7c58175e78197015223a9a150d4517f2acbaae722ea1651aa08cab773db"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.393164 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gjrbj" event={"ID":"005bbea4-d7a5-413a-9c19-d9503a370566","Type":"ContainerStarted","Data":"802731187b765e5f468b21f566565cc20692b2b8de64f4181b21aab0ccf5a360"}
Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.393555 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status=""
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gjrbj" Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.401466 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:18 crc kubenswrapper[4636]: E1003 14:03:18.407559 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:18.907536284 +0000 UTC m=+148.766262531 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.411703 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jmjtn" event={"ID":"8bc7cfea-82a6-48a1-995a-b473d672a62d","Type":"ContainerStarted","Data":"884f757670ab47001fab45161b13bf54b40037f0151f6109238d5dec18d74754"} Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.411910 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jmjtn" event={"ID":"8bc7cfea-82a6-48a1-995a-b473d672a62d","Type":"ContainerStarted","Data":"a34a6c44e235a156629204072e61fcbba9e0a37b4d6500b3de2c60434d0ce03a"} Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.413391 4636 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-44xt7 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.413448 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-44xt7" podUID="8d6ca41b-f3b4-4bce-ad85-0150cfeb2362" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.414511 4636 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-mz2wd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.414548 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" podUID="6b8981cc-75fe-4ecd-971f-f01c74e8fd74" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Oct 03 14:03:18 crc 
kubenswrapper[4636]: I1003 14:03:18.414621 4636 patch_prober.go:28] interesting pod/downloads-7954f5f757-5q7j6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.414648 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5q7j6" podUID="40e1f28c-6d64-4fa1-b554-507ff389f115" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.427355 4636 patch_prober.go:28] interesting pod/console-operator-58897d9998-fzp9w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.427398 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-fzp9w" podUID="55eb43d6-a42c-4b21-a8e9-82d2c75ee839" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.439862 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pb9vh" podStartSLOduration=125.439844558 podStartE2EDuration="2m5.439844558s" podCreationTimestamp="2025-10-03 14:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:18.417911859 +0000 UTC m=+148.276638116" watchObservedRunningTime="2025-10-03 14:03:18.439844558 +0000 UTC m=+148.298570805" Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.440267 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-bgrwl" podStartSLOduration=8.440258378 podStartE2EDuration="8.440258378s" podCreationTimestamp="2025-10-03 14:03:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:18.389458343 +0000 UTC m=+148.248184590" watchObservedRunningTime="2025-10-03 14:03:18.440258378 +0000 UTC m=+148.298984625" Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.454056 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:18 crc kubenswrapper[4636]: E1003 14:03:18.507359 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:19.007331528 +0000 UTC m=+148.866057875 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.508698 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-fcxkp" podStartSLOduration=125.508681742 podStartE2EDuration="2m5.508681742s" podCreationTimestamp="2025-10-03 14:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:18.447087652 +0000 UTC m=+148.305813899" watchObservedRunningTime="2025-10-03 14:03:18.508681742 +0000 UTC m=+148.367407989" Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.602983 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-htgvn" podStartSLOduration=125.602967895 podStartE2EDuration="2m5.602967895s" podCreationTimestamp="2025-10-03 14:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:18.536820259 +0000 UTC m=+148.395546506" watchObservedRunningTime="2025-10-03 14:03:18.602967895 +0000 UTC m=+148.461694142" Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.603031 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:18 crc kubenswrapper[4636]: E1003 14:03:18.603127 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:19.103081068 +0000 UTC m=+148.961807315 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.604645 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:18 crc kubenswrapper[4636]: E1003 14:03:18.608155 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:19.108143137 +0000 UTC m=+148.966869384 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.684966 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j" podStartSLOduration=125.684940975 podStartE2EDuration="2m5.684940975s" podCreationTimestamp="2025-10-03 14:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:18.6023644 +0000 UTC m=+148.461090647" watchObservedRunningTime="2025-10-03 14:03:18.684940975 +0000 UTC m=+148.543667232" Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.711615 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:18 crc kubenswrapper[4636]: E1003 14:03:18.711968 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:19.211942933 +0000 UTC m=+149.070669180 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.735376 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-877d7" podStartSLOduration=126.73535953 podStartE2EDuration="2m6.73535953s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:18.733776379 +0000 UTC m=+148.592502636" watchObservedRunningTime="2025-10-03 14:03:18.73535953 +0000 UTC m=+148.594085777" Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.809675 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hndq5" podStartSLOduration=125.809657353 podStartE2EDuration="2m5.809657353s" podCreationTimestamp="2025-10-03 14:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:18.808708389 +0000 UTC m=+148.667434626" watchObservedRunningTime="2025-10-03 14:03:18.809657353 +0000 UTC m=+148.668383600" Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.814854 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:18 crc kubenswrapper[4636]: E1003 14:03:18.815193 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:19.315178894 +0000 UTC m=+149.173905141 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.902469 4636 patch_prober.go:28] interesting pod/router-default-5444994796-zvvrp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 14:03:18 crc kubenswrapper[4636]: [-]has-synced failed: reason withheld Oct 03 14:03:18 crc kubenswrapper[4636]: [+]process-running ok Oct 03 14:03:18 crc kubenswrapper[4636]: healthz check failed Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.902529 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zvvrp" podUID="ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.913720 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-c9dcg" podStartSLOduration=126.913697865 podStartE2EDuration="2m6.913697865s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:18.911584711 +0000 UTC m=+148.770310958" watchObservedRunningTime="2025-10-03 14:03:18.913697865 +0000 UTC m=+148.772424112" Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.914667 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-92b88" podStartSLOduration=8.91465927 podStartE2EDuration="8.91465927s" podCreationTimestamp="2025-10-03 14:03:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:18.852348792 +0000 UTC m=+148.711075049" watchObservedRunningTime="2025-10-03 14:03:18.91465927 +0000 UTC m=+148.773385517" Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.915667 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:18 crc kubenswrapper[4636]: E1003 14:03:18.916143 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:19.416078576 +0000 UTC m=+149.274804823 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:18 crc kubenswrapper[4636]: I1003 14:03:18.952618 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jmjtn" podStartSLOduration=126.952603927 podStartE2EDuration="2m6.952603927s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:18.950450222 +0000 UTC m=+148.809176469" watchObservedRunningTime="2025-10-03 14:03:18.952603927 +0000 UTC m=+148.811330164" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.018044 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:19 crc kubenswrapper[4636]: E1003 14:03:19.018564 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:19.518540338 +0000 UTC m=+149.377266695 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.064910 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ds4fg" podStartSLOduration=126.064891459 podStartE2EDuration="2m6.064891459s" podCreationTimestamp="2025-10-03 14:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:19.025059694 +0000 UTC m=+148.883785941" watchObservedRunningTime="2025-10-03 14:03:19.064891459 +0000 UTC m=+148.923617706" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.098788 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-qzkgg" podStartSLOduration=126.098769102 podStartE2EDuration="2m6.098769102s" podCreationTimestamp="2025-10-03 14:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:19.064478998 +0000 UTC m=+148.923205245" watchObservedRunningTime="2025-10-03 14:03:19.098769102 +0000 UTC m=+148.957495349" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.099748 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qfbfg" podStartSLOduration=127.099741677 podStartE2EDuration="2m7.099741677s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:19.098562907 +0000 UTC m=+148.957289154" watchObservedRunningTime="2025-10-03 14:03:19.099741677 +0000 UTC m=+148.958467924" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.119593 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:19 crc kubenswrapper[4636]: E1003 14:03:19.119936 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:19.619910101 +0000 UTC m=+149.478636348 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.162611 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" podStartSLOduration=126.162595409 podStartE2EDuration="2m6.162595409s" podCreationTimestamp="2025-10-03 14:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:19.130584793 +0000 UTC m=+148.989311040" watchObservedRunningTime="2025-10-03 14:03:19.162595409 +0000 UTC m=+149.021321656" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.164124 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" podStartSLOduration=127.164115778 podStartE2EDuration="2m7.164115778s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:19.161353768 +0000 UTC m=+149.020080025" watchObservedRunningTime="2025-10-03 14:03:19.164115778 +0000 UTC m=+149.022842015" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.211623 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gjrbj" podStartSLOduration=126.211605677 podStartE2EDuration="2m6.211605677s" podCreationTimestamp="2025-10-03 14:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:19.21051504 +0000 UTC m=+149.069241307" watchObservedRunningTime="2025-10-03 14:03:19.211605677 +0000 UTC m=+149.070331924" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.211822 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rsgnn" podStartSLOduration=127.211813653 podStartE2EDuration="2m7.211813653s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:19.182402193 +0000 UTC m=+149.041128450" watchObservedRunningTime="2025-10-03 14:03:19.211813653 +0000 UTC m=+149.070539900" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.220497 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:19 crc kubenswrapper[4636]: E1003 14:03:19.220856 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-10-03 14:03:19.720844593 +0000 UTC m=+149.579570830 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.248888 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-vzs54" podStartSLOduration=127.248870247 podStartE2EDuration="2m7.248870247s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:19.236595444 +0000 UTC m=+149.095321691" watchObservedRunningTime="2025-10-03 14:03:19.248870247 +0000 UTC m=+149.107596494" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.303173 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qz8k" podStartSLOduration=127.303159111 podStartE2EDuration="2m7.303159111s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:19.301601631 +0000 UTC m=+149.160327878" watchObservedRunningTime="2025-10-03 14:03:19.303159111 +0000 UTC m=+149.161885358" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.321518 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:19 crc kubenswrapper[4636]: E1003 14:03:19.322085 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:19.822070873 +0000 UTC m=+149.680797120 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.329733 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvb2q" podStartSLOduration=126.329714408 podStartE2EDuration="2m6.329714408s" podCreationTimestamp="2025-10-03 14:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:19.321045367 +0000 UTC m=+149.179771614" watchObservedRunningTime="2025-10-03 14:03:19.329714408 +0000 UTC m=+149.188440665" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.418017 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" event={"ID":"0777cf5e-3bec-495f-8e8c-5d25b7a7b46b","Type":"ContainerStarted","Data":"c50cfc9ce5d6cdbc640007a935e298095d7a79f6bc975e6f2b32d41e90fafe2e"} Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.419741 4636 generic.go:334] "Generic (PLEG): container finished" podID="64802881-57b2-4263-b5d8-f3c4c224c692" containerID="cf184242456787a73f2999d28c4eb1472742241b8b3a158f2e7c20f76ead3285" exitCode=0 Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.419787 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf" event={"ID":"64802881-57b2-4263-b5d8-f3c4c224c692","Type":"ContainerDied","Data":"cf184242456787a73f2999d28c4eb1472742241b8b3a158f2e7c20f76ead3285"} Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.422976 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:19 crc kubenswrapper[4636]: E1003 14:03:19.423254 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:19.923243892 +0000 UTC m=+149.781970139 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.423380 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" event={"ID":"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264","Type":"ContainerStarted","Data":"ac69e7343bf82b3b55e85c0634219444ecfaf00ba8d7017227e0234663dcf0f7"} Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.424689 4636 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lvb2q container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.424716 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvb2q" podUID="ab96097e-4edb-4abf-a7b4-c05a20f86659" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.425373 4636 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rjp5j container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.425385 4636 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-r94vn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body= Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.425422 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j" podUID="bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.425463 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn" podUID="ccb55971-c2d7-440a-bb7a-dcc1d9c0b562" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.425600 4636 patch_prober.go:28] interesting pod/downloads-7954f5f757-5q7j6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.425623 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5q7j6" 
podUID="40e1f28c-6d64-4fa1-b554-507ff389f115" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.434993 4636 patch_prober.go:28] interesting pod/console-operator-58897d9998-fzp9w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.435253 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-fzp9w" podUID="55eb43d6-a42c-4b21-a8e9-82d2c75ee839" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.441145 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.442907 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn" podStartSLOduration=126.442890382 podStartE2EDuration="2m6.442890382s" podCreationTimestamp="2025-10-03 14:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:19.361844137 +0000 UTC m=+149.220570384" watchObservedRunningTime="2025-10-03 14:03:19.442890382 +0000 UTC m=+149.301616629" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.443307 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt" podStartSLOduration=126.443299473 podStartE2EDuration="2m6.443299473s" podCreationTimestamp="2025-10-03 14:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:19.441524448 +0000 UTC m=+149.300250705" watchObservedRunningTime="2025-10-03 14:03:19.443299473 +0000 UTC m=+149.302025710" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.510427 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-44xt7" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.524497 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:19 crc kubenswrapper[4636]: E1003 14:03:19.524763 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:20.024739439 +0000 UTC m=+149.883465686 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.526014 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:19 crc kubenswrapper[4636]: E1003 14:03:19.526430 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:20.026415321 +0000 UTC m=+149.885141568 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.630943 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:19 crc kubenswrapper[4636]: E1003 14:03:19.631140 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:20.13111432 +0000 UTC m=+149.989840567 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.631364 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:19 crc kubenswrapper[4636]: E1003 14:03:19.631689 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:20.131662144 +0000 UTC m=+149.990388391 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.705795 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.733024 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:19 crc kubenswrapper[4636]: E1003 14:03:19.733320 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:20.233285174 +0000 UTC m=+150.092011421 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.733544 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:19 crc kubenswrapper[4636]: E1003 14:03:19.733903 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:20.233885329 +0000 UTC m=+150.092611646 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.835073 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.835274 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.835311 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.835368 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.835411 4636 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:03:19 crc kubenswrapper[4636]: E1003 14:03:19.837260 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:20.337236254 +0000 UTC m=+150.195962501 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.839782 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.843922 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.844026 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.846091 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.902413 4636 patch_prober.go:28] interesting pod/router-default-5444994796-zvvrp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 14:03:19 crc kubenswrapper[4636]: [-]has-synced failed: reason withheld Oct 03 14:03:19 crc kubenswrapper[4636]: [+]process-running ok Oct 03 14:03:19 crc kubenswrapper[4636]: healthz check failed Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.902482 4636 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-zvvrp" podUID="ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 14:03:19 crc kubenswrapper[4636]: I1003 14:03:19.936673 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:19 crc kubenswrapper[4636]: E1003 14:03:19.937020 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:20.437005257 +0000 UTC m=+150.295731504 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.012301 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.027117 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.033532 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.037312 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:20 crc kubenswrapper[4636]: E1003 14:03:20.042359 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:20.542317091 +0000 UTC m=+150.401043338 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.138837 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:20 crc kubenswrapper[4636]: E1003 14:03:20.139162 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:20.639150869 +0000 UTC m=+150.497877116 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.239844 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:20 crc kubenswrapper[4636]: E1003 14:03:20.240554 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:20.740537763 +0000 UTC m=+150.599264010 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.302384 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.342807 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:20 crc kubenswrapper[4636]: E1003 14:03:20.343247 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:20.84321753 +0000 UTC m=+150.701943777 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.443324 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:20 crc kubenswrapper[4636]: E1003 14:03:20.443649 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:20.94363463 +0000 UTC m=+150.802360867 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.443725 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" event={"ID":"62f6cc0c-eb9d-44ef-8ce7-93a6148c3264","Type":"ContainerStarted","Data":"52cf2c3ab2020c55b883e1e839a48f559dc3dd9394fb065affec7301ec24ef32"} Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.445398 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" event={"ID":"72d5c706-f441-4f26-99b0-c8979fb0c3f3","Type":"ContainerStarted","Data":"652fb3648d207fa572f27b5519ccf94871b3f7d6579fa1497f168aca60e9bf46"} Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.447045 4636 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lvb2q container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.447071 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvb2q" podUID="ab96097e-4edb-4abf-a7b4-c05a20f86659" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.465014 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tzzbm"] Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.472915 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tzzbm" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.483941 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.492749 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" podStartSLOduration=128.492733901 podStartE2EDuration="2m8.492733901s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:20.484227784 +0000 UTC m=+150.342954031" watchObservedRunningTime="2025-10-03 14:03:20.492733901 +0000 UTC m=+150.351460148" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.495035 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tzzbm"] Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.545181 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7srxq\" (UniqueName: \"kubernetes.io/projected/a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34-kube-api-access-7srxq\") pod \"certified-operators-tzzbm\" (UID: \"a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34\") " pod="openshift-marketplace/certified-operators-tzzbm" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.545351 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34-catalog-content\") pod \"certified-operators-tzzbm\" (UID: \"a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34\") " pod="openshift-marketplace/certified-operators-tzzbm" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.545425 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.545542 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34-utilities\") pod \"certified-operators-tzzbm\" (UID: \"a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34\") " pod="openshift-marketplace/certified-operators-tzzbm" Oct 03 14:03:20 crc kubenswrapper[4636]: E1003 14:03:20.551805 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:21.051788946 +0000 UTC m=+150.910515193 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.648672 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7q4hj"] Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.649533 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7q4hj" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.650607 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:20 crc kubenswrapper[4636]: E1003 14:03:20.650772 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:21.150747019 +0000 UTC m=+151.009473266 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.650827 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34-utilities\") pod \"certified-operators-tzzbm\" (UID: \"a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34\") " pod="openshift-marketplace/certified-operators-tzzbm" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.650962 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7srxq\" (UniqueName: \"kubernetes.io/projected/a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34-kube-api-access-7srxq\") pod \"certified-operators-tzzbm\" (UID: \"a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34\") " pod="openshift-marketplace/certified-operators-tzzbm" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.651054 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34-catalog-content\") pod \"certified-operators-tzzbm\" (UID: \"a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34\") " pod="openshift-marketplace/certified-operators-tzzbm" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.651110 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:20 crc kubenswrapper[4636]: E1003 14:03:20.651384 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:21.151377215 +0000 UTC m=+151.010103462 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.651462 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34-utilities\") pod \"certified-operators-tzzbm\" (UID: \"a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34\") " pod="openshift-marketplace/certified-operators-tzzbm" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.651533 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34-catalog-content\") pod \"certified-operators-tzzbm\" (UID: \"a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34\") " pod="openshift-marketplace/certified-operators-tzzbm" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.656891 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.680551 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7srxq\" (UniqueName: \"kubernetes.io/projected/a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34-kube-api-access-7srxq\") pod \"certified-operators-tzzbm\" (UID: \"a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34\") " pod="openshift-marketplace/certified-operators-tzzbm" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.708060 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7q4hj"] Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.752214 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:20 crc kubenswrapper[4636]: E1003 14:03:20.752363 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:21.252339758 +0000 UTC m=+151.111065995 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.752405 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.752449 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c76d8db-b385-49b1-b8cf-f7286f3e49c2-catalog-content\") pod \"community-operators-7q4hj\" (UID: \"3c76d8db-b385-49b1-b8cf-f7286f3e49c2\") " pod="openshift-marketplace/community-operators-7q4hj" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.752487 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c45pj\" (UniqueName: \"kubernetes.io/projected/3c76d8db-b385-49b1-b8cf-f7286f3e49c2-kube-api-access-c45pj\") pod \"community-operators-7q4hj\" (UID: \"3c76d8db-b385-49b1-b8cf-f7286f3e49c2\") " pod="openshift-marketplace/community-operators-7q4hj" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.752502 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c76d8db-b385-49b1-b8cf-f7286f3e49c2-utilities\") pod \"community-operators-7q4hj\" (UID: \"3c76d8db-b385-49b1-b8cf-f7286f3e49c2\") " pod="openshift-marketplace/community-operators-7q4hj" Oct 03 14:03:20 crc kubenswrapper[4636]: E1003 14:03:20.752792 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:21.252779079 +0000 UTC m=+151.111505326 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.822495 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tzzbm" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.853752 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.854014 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c76d8db-b385-49b1-b8cf-f7286f3e49c2-catalog-content\") pod \"community-operators-7q4hj\" (UID: \"3c76d8db-b385-49b1-b8cf-f7286f3e49c2\") " pod="openshift-marketplace/community-operators-7q4hj" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.854059 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c45pj\" (UniqueName: \"kubernetes.io/projected/3c76d8db-b385-49b1-b8cf-f7286f3e49c2-kube-api-access-c45pj\") pod \"community-operators-7q4hj\" (UID: \"3c76d8db-b385-49b1-b8cf-f7286f3e49c2\") " pod="openshift-marketplace/community-operators-7q4hj" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.854076 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c76d8db-b385-49b1-b8cf-f7286f3e49c2-utilities\") pod \"community-operators-7q4hj\" (UID: \"3c76d8db-b385-49b1-b8cf-f7286f3e49c2\") " pod="openshift-marketplace/community-operators-7q4hj" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.854464 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c76d8db-b385-49b1-b8cf-f7286f3e49c2-utilities\") pod \"community-operators-7q4hj\" (UID: \"3c76d8db-b385-49b1-b8cf-f7286f3e49c2\") " pod="openshift-marketplace/community-operators-7q4hj" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.854595 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c76d8db-b385-49b1-b8cf-f7286f3e49c2-catalog-content\") pod \"community-operators-7q4hj\" (UID: \"3c76d8db-b385-49b1-b8cf-f7286f3e49c2\") " pod="openshift-marketplace/community-operators-7q4hj" Oct 03 14:03:20 crc kubenswrapper[4636]: E1003 14:03:20.854698 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:21.354680076 +0000 UTC m=+151.213406323 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.905254 4636 patch_prober.go:28] interesting pod/router-default-5444994796-zvvrp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 14:03:20 crc kubenswrapper[4636]: [-]has-synced failed: reason withheld Oct 03 14:03:20 crc kubenswrapper[4636]: [+]process-running ok Oct 03 14:03:20 crc kubenswrapper[4636]: healthz check failed Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.905313 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zvvrp" podUID="ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.959734 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:20 crc kubenswrapper[4636]: E1003 14:03:20.959987 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:21.45997703 +0000 UTC m=+151.318703277 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.976484 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p7txd"] Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.978153 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p7txd" Oct 03 14:03:20 crc kubenswrapper[4636]: I1003 14:03:20.985321 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c45pj\" (UniqueName: \"kubernetes.io/projected/3c76d8db-b385-49b1-b8cf-f7286f3e49c2-kube-api-access-c45pj\") pod \"community-operators-7q4hj\" (UID: \"3c76d8db-b385-49b1-b8cf-f7286f3e49c2\") " pod="openshift-marketplace/community-operators-7q4hj" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.045407 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p7txd"] Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.060791 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.061748 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ppfb\" (UniqueName: \"kubernetes.io/projected/29d1406c-66ee-4666-9627-e62af43b4f3d-kube-api-access-9ppfb\") pod \"certified-operators-p7txd\" (UID: \"29d1406c-66ee-4666-9627-e62af43b4f3d\") " pod="openshift-marketplace/certified-operators-p7txd" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.061794 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29d1406c-66ee-4666-9627-e62af43b4f3d-catalog-content\") pod \"certified-operators-p7txd\" (UID: \"29d1406c-66ee-4666-9627-e62af43b4f3d\") " pod="openshift-marketplace/certified-operators-p7txd" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.061843 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29d1406c-66ee-4666-9627-e62af43b4f3d-utilities\") pod \"certified-operators-p7txd\" (UID: \"29d1406c-66ee-4666-9627-e62af43b4f3d\") " pod="openshift-marketplace/certified-operators-p7txd" Oct 03 14:03:21 crc kubenswrapper[4636]: E1003 14:03:21.061963 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:21.561944869 +0000 UTC m=+151.420671116 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.164500 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ppfb\" (UniqueName: \"kubernetes.io/projected/29d1406c-66ee-4666-9627-e62af43b4f3d-kube-api-access-9ppfb\") pod \"certified-operators-p7txd\" (UID: \"29d1406c-66ee-4666-9627-e62af43b4f3d\") " pod="openshift-marketplace/certified-operators-p7txd" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.164534 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29d1406c-66ee-4666-9627-e62af43b4f3d-catalog-content\") pod \"certified-operators-p7txd\" (UID: \"29d1406c-66ee-4666-9627-e62af43b4f3d\") " pod="openshift-marketplace/certified-operators-p7txd" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.164563 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.164608 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29d1406c-66ee-4666-9627-e62af43b4f3d-utilities\") pod \"certified-operators-p7txd\" (UID: \"29d1406c-66ee-4666-9627-e62af43b4f3d\") " pod="openshift-marketplace/certified-operators-p7txd" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.165041 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29d1406c-66ee-4666-9627-e62af43b4f3d-utilities\") pod \"certified-operators-p7txd\" (UID: \"29d1406c-66ee-4666-9627-e62af43b4f3d\") " pod="openshift-marketplace/certified-operators-p7txd" Oct 03 14:03:21 crc kubenswrapper[4636]: E1003 14:03:21.165345 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:21.665333374 +0000 UTC m=+151.524059621 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.167388 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29d1406c-66ee-4666-9627-e62af43b4f3d-catalog-content\") pod \"certified-operators-p7txd\" (UID: \"29d1406c-66ee-4666-9627-e62af43b4f3d\") " pod="openshift-marketplace/certified-operators-p7txd" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.203682 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-82bmp"] Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.204743 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-82bmp" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.238141 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-82bmp"] Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.248792 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ppfb\" (UniqueName: \"kubernetes.io/projected/29d1406c-66ee-4666-9627-e62af43b4f3d-kube-api-access-9ppfb\") pod \"certified-operators-p7txd\" (UID: \"29d1406c-66ee-4666-9627-e62af43b4f3d\") " pod="openshift-marketplace/certified-operators-p7txd" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.263373 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7q4hj" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.265571 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.265815 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6040820f-38e4-4416-8648-32025aee8fcb-catalog-content\") pod \"community-operators-82bmp\" (UID: \"6040820f-38e4-4416-8648-32025aee8fcb\") " pod="openshift-marketplace/community-operators-82bmp" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.265856 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6040820f-38e4-4416-8648-32025aee8fcb-utilities\") pod \"community-operators-82bmp\" (UID: \"6040820f-38e4-4416-8648-32025aee8fcb\") " pod="openshift-marketplace/community-operators-82bmp" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.265887 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2gsw\" (UniqueName: \"kubernetes.io/projected/6040820f-38e4-4416-8648-32025aee8fcb-kube-api-access-f2gsw\") pod \"community-operators-82bmp\" (UID: \"6040820f-38e4-4416-8648-32025aee8fcb\") " pod="openshift-marketplace/community-operators-82bmp" Oct 03 14:03:21 crc kubenswrapper[4636]: E1003 14:03:21.265982 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:21.765967889 +0000 UTC m=+151.624694136 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.326585 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p7txd" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.370801 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6040820f-38e4-4416-8648-32025aee8fcb-utilities\") pod \"community-operators-82bmp\" (UID: \"6040820f-38e4-4416-8648-32025aee8fcb\") " pod="openshift-marketplace/community-operators-82bmp" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.370836 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.370867 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2gsw\" (UniqueName: \"kubernetes.io/projected/6040820f-38e4-4416-8648-32025aee8fcb-kube-api-access-f2gsw\") pod \"community-operators-82bmp\" (UID: \"6040820f-38e4-4416-8648-32025aee8fcb\") " pod="openshift-marketplace/community-operators-82bmp" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.370954 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6040820f-38e4-4416-8648-32025aee8fcb-catalog-content\") pod \"community-operators-82bmp\" (UID: \"6040820f-38e4-4416-8648-32025aee8fcb\") " pod="openshift-marketplace/community-operators-82bmp" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.371756 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6040820f-38e4-4416-8648-32025aee8fcb-utilities\") pod \"community-operators-82bmp\" (UID: \"6040820f-38e4-4416-8648-32025aee8fcb\") " pod="openshift-marketplace/community-operators-82bmp" Oct 03 14:03:21 crc kubenswrapper[4636]: E1003 14:03:21.371978 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:21.871966951 +0000 UTC m=+151.730693198 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.372268 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6040820f-38e4-4416-8648-32025aee8fcb-catalog-content\") pod \"community-operators-82bmp\" (UID: \"6040820f-38e4-4416-8648-32025aee8fcb\") " pod="openshift-marketplace/community-operators-82bmp" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.417979 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2gsw\" (UniqueName: \"kubernetes.io/projected/6040820f-38e4-4416-8648-32025aee8fcb-kube-api-access-f2gsw\") pod \"community-operators-82bmp\" (UID: \"6040820f-38e4-4416-8648-32025aee8fcb\") " pod="openshift-marketplace/community-operators-82bmp" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.449280 4636 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-r94vn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.449347 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn" podUID="ccb55971-c2d7-440a-bb7a-dcc1d9c0b562" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.462310 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.463201 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.473468 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:21 crc kubenswrapper[4636]: E1003 14:03:21.473839 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:21.973824437 +0000 UTC m=+151.832550674 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.477625 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"cd0c38189f5f8b60a8e0b4e245bedc8ed12c49cf6924828cae4fe59e7a86642c"} Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.477667 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"76f95f8a418d54034ec754d18aca97b4fcb7d7174fdbeae30c7d68c628903c08"} Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.491519 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.491739 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.520439 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.524427 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-82bmp" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.576801 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/61e0cf54-1f42-4b7d-9787-db8e0cf348aa-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"61e0cf54-1f42-4b7d-9787-db8e0cf348aa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.576839 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.576933 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/61e0cf54-1f42-4b7d-9787-db8e0cf348aa-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"61e0cf54-1f42-4b7d-9787-db8e0cf348aa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 14:03:21 crc kubenswrapper[4636]: E1003 14:03:21.579142 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:22.079126801 +0000 UTC m=+151.937853048 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.677795 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.678039 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/61e0cf54-1f42-4b7d-9787-db8e0cf348aa-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"61e0cf54-1f42-4b7d-9787-db8e0cf348aa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.678113 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/61e0cf54-1f42-4b7d-9787-db8e0cf348aa-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"61e0cf54-1f42-4b7d-9787-db8e0cf348aa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 14:03:21 crc kubenswrapper[4636]: E1003 14:03:21.678426 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:22.178377011 +0000 UTC m=+152.037103258 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.678450 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/61e0cf54-1f42-4b7d-9787-db8e0cf348aa-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"61e0cf54-1f42-4b7d-9787-db8e0cf348aa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.780316 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:21 crc kubenswrapper[4636]: E1003 14:03:21.780622 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:22.280608616 +0000 UTC m=+152.139334863 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.791754 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/61e0cf54-1f42-4b7d-9787-db8e0cf348aa-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"61e0cf54-1f42-4b7d-9787-db8e0cf348aa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 03 14:03:21 crc kubenswrapper[4636]: W1003 14:03:21.793378 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-7172b32ef12c5cc93d3807d2cde12bf0b16c8871e33cc4eb0c708dab4b43a567 WatchSource:0}: Error finding container 7172b32ef12c5cc93d3807d2cde12bf0b16c8871e33cc4eb0c708dab4b43a567: Status 404 returned error can't find the container with id 7172b32ef12c5cc93d3807d2cde12bf0b16c8871e33cc4eb0c708dab4b43a567
Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.881743 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:21 crc kubenswrapper[4636]: E1003 14:03:21.882060 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:22.382045752 +0000 UTC m=+152.240771999 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.906729 4636 patch_prober.go:28] interesting pod/router-default-5444994796-zvvrp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 14:03:21 crc kubenswrapper[4636]: [-]has-synced failed: reason withheld
Oct 03 14:03:21 crc kubenswrapper[4636]: [+]process-running ok
Oct 03 14:03:21 crc kubenswrapper[4636]: healthz check failed
Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.906777 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zvvrp" podUID="ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 14:03:21 crc kubenswrapper[4636]: I1003 14:03:21.983868 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:21 crc kubenswrapper[4636]: E1003 14:03:21.984435 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:22.484424261 +0000 UTC m=+152.343150508 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.085514 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:22 crc kubenswrapper[4636]: E1003 14:03:22.086297 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:22.586276907 +0000 UTC m=+152.445003154 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.089542 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.195451 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:22 crc kubenswrapper[4636]: E1003 14:03:22.197168 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:22.697147653 +0000 UTC m=+152.555873900 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.314565 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:22 crc kubenswrapper[4636]: E1003 14:03:22.315154 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:22.81512935 +0000 UTC m=+152.673855597 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.326567 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.415575 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x5qd\" (UniqueName: \"kubernetes.io/projected/64802881-57b2-4263-b5d8-f3c4c224c692-kube-api-access-6x5qd\") pod \"64802881-57b2-4263-b5d8-f3c4c224c692\" (UID: \"64802881-57b2-4263-b5d8-f3c4c224c692\") "
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.416139 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/64802881-57b2-4263-b5d8-f3c4c224c692-secret-volume\") pod \"64802881-57b2-4263-b5d8-f3c4c224c692\" (UID: \"64802881-57b2-4263-b5d8-f3c4c224c692\") "
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.416176 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/64802881-57b2-4263-b5d8-f3c4c224c692-config-volume\") pod \"64802881-57b2-4263-b5d8-f3c4c224c692\" (UID: \"64802881-57b2-4263-b5d8-f3c4c224c692\") "
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.416425 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:22 crc kubenswrapper[4636]: E1003 14:03:22.416735 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:22.91671783 +0000 UTC m=+152.775444077 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.418661 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64802881-57b2-4263-b5d8-f3c4c224c692-config-volume" (OuterVolumeSpecName: "config-volume") pod "64802881-57b2-4263-b5d8-f3c4c224c692" (UID: "64802881-57b2-4263-b5d8-f3c4c224c692"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.443310 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64802881-57b2-4263-b5d8-f3c4c224c692-kube-api-access-6x5qd" (OuterVolumeSpecName: "kube-api-access-6x5qd") pod "64802881-57b2-4263-b5d8-f3c4c224c692" (UID: "64802881-57b2-4263-b5d8-f3c4c224c692"). InnerVolumeSpecName "kube-api-access-6x5qd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.446207 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64802881-57b2-4263-b5d8-f3c4c224c692-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "64802881-57b2-4263-b5d8-f3c4c224c692" (UID: "64802881-57b2-4263-b5d8-f3c4c224c692"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.490750 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"7172b32ef12c5cc93d3807d2cde12bf0b16c8871e33cc4eb0c708dab4b43a567"}
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.501587 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nswpx"]
Oct 03 14:03:22 crc kubenswrapper[4636]: E1003 14:03:22.501761 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64802881-57b2-4263-b5d8-f3c4c224c692" containerName="collect-profiles"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.501771 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="64802881-57b2-4263-b5d8-f3c4c224c692" containerName="collect-profiles"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.501859 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="64802881-57b2-4263-b5d8-f3c4c224c692" containerName="collect-profiles"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.502586 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nswpx"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.509578 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.512305 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8cf950ff0bc118530254b1304fc332f3e469e968d571a12aec1d3d628f6fd270"}
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.524005 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.524242 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x5qd\" (UniqueName: \"kubernetes.io/projected/64802881-57b2-4263-b5d8-f3c4c224c692-kube-api-access-6x5qd\") on node \"crc\" DevicePath \"\""
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.524255 4636 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/64802881-57b2-4263-b5d8-f3c4c224c692-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.524264 4636 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/64802881-57b2-4263-b5d8-f3c4c224c692-config-volume\") on node \"crc\" DevicePath \"\""
Oct 03 14:03:22 crc kubenswrapper[4636]: E1003 14:03:22.524318 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:23.024303122 +0000 UTC m=+152.883029359 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.536360 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.539952 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf" event={"ID":"64802881-57b2-4263-b5d8-f3c4c224c692","Type":"ContainerDied","Data":"c43e9295d59321d1c558fe99f533f70366460a3c6df8158329a787dd1837adf7"}
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.540003 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c43e9295d59321d1c558fe99f533f70366460a3c6df8158329a787dd1837adf7"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.617698 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p7txd"]
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.626397 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/727c69c5-0eaa-4dba-b5a8-131486e3636e-utilities\") pod \"redhat-marketplace-nswpx\" (UID: \"727c69c5-0eaa-4dba-b5a8-131486e3636e\") " pod="openshift-marketplace/redhat-marketplace-nswpx"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.626469 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/727c69c5-0eaa-4dba-b5a8-131486e3636e-catalog-content\") pod \"redhat-marketplace-nswpx\" (UID: \"727c69c5-0eaa-4dba-b5a8-131486e3636e\") " pod="openshift-marketplace/redhat-marketplace-nswpx"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.626508 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.626586 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cvmt\" (UniqueName: \"kubernetes.io/projected/727c69c5-0eaa-4dba-b5a8-131486e3636e-kube-api-access-4cvmt\") pod \"redhat-marketplace-nswpx\" (UID: \"727c69c5-0eaa-4dba-b5a8-131486e3636e\") " pod="openshift-marketplace/redhat-marketplace-nswpx"
Oct 03 14:03:22 crc kubenswrapper[4636]: E1003 14:03:22.628991 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:23.12897314 +0000 UTC m=+152.987699387 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.639747 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nswpx"]
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.677534 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qz8k"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.728872 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:22 crc kubenswrapper[4636]: E1003 14:03:22.729401 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:23.229375569 +0000 UTC m=+153.088101816 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.729789 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cvmt\" (UniqueName: \"kubernetes.io/projected/727c69c5-0eaa-4dba-b5a8-131486e3636e-kube-api-access-4cvmt\") pod \"redhat-marketplace-nswpx\" (UID: \"727c69c5-0eaa-4dba-b5a8-131486e3636e\") " pod="openshift-marketplace/redhat-marketplace-nswpx"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.729860 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/727c69c5-0eaa-4dba-b5a8-131486e3636e-utilities\") pod \"redhat-marketplace-nswpx\" (UID: \"727c69c5-0eaa-4dba-b5a8-131486e3636e\") " pod="openshift-marketplace/redhat-marketplace-nswpx"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.729878 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/727c69c5-0eaa-4dba-b5a8-131486e3636e-catalog-content\") pod \"redhat-marketplace-nswpx\" (UID: \"727c69c5-0eaa-4dba-b5a8-131486e3636e\") " pod="openshift-marketplace/redhat-marketplace-nswpx"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.729913 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:22 crc kubenswrapper[4636]: E1003 14:03:22.730345 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:23.230333423 +0000 UTC m=+153.089059670 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.730533 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/727c69c5-0eaa-4dba-b5a8-131486e3636e-utilities\") pod \"redhat-marketplace-nswpx\" (UID: \"727c69c5-0eaa-4dba-b5a8-131486e3636e\") " pod="openshift-marketplace/redhat-marketplace-nswpx"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.730825 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/727c69c5-0eaa-4dba-b5a8-131486e3636e-catalog-content\") pod \"redhat-marketplace-nswpx\" (UID: \"727c69c5-0eaa-4dba-b5a8-131486e3636e\") " pod="openshift-marketplace/redhat-marketplace-nswpx"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.757571 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-lqxss"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.757611 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-lqxss"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.759499 4636 patch_prober.go:28] interesting pod/console-f9d7485db-lqxss container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.759539 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-lqxss" podUID="f6977d44-d8ff-4d40-959f-024da50c53fe" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.830619 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:22 crc kubenswrapper[4636]: E1003 14:03:22.831719 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:23.331693446 +0000 UTC m=+153.190419763 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.882456 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bp525"]
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.904461 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-zvvrp"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.905261 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bp525"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.916760 4636 patch_prober.go:28] interesting pod/router-default-5444994796-zvvrp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 14:03:22 crc kubenswrapper[4636]: [-]has-synced failed: reason withheld
Oct 03 14:03:22 crc kubenswrapper[4636]: [+]process-running ok
Oct 03 14:03:22 crc kubenswrapper[4636]: healthz check failed
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.916807 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zvvrp" podUID="ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.918974 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cvmt\" (UniqueName: \"kubernetes.io/projected/727c69c5-0eaa-4dba-b5a8-131486e3636e-kube-api-access-4cvmt\") pod \"redhat-marketplace-nswpx\" (UID: \"727c69c5-0eaa-4dba-b5a8-131486e3636e\") " pod="openshift-marketplace/redhat-marketplace-nswpx"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.924422 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bp525"]
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.931996 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:22 crc kubenswrapper[4636]: E1003 14:03:22.932469 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:23.432455944 +0000 UTC m=+153.291182191 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.983717 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt"
Oct 03 14:03:22 crc kubenswrapper[4636]: I1003 14:03:22.984826 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.002193 4636 patch_prober.go:28] interesting pod/downloads-7954f5f757-5q7j6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.002235 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5q7j6" podUID="40e1f28c-6d64-4fa1-b554-507ff389f115" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.002409 4636 patch_prober.go:28] interesting pod/downloads-7954f5f757-5q7j6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.002425 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5q7j6" podUID="40e1f28c-6d64-4fa1-b554-507ff389f115" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.039563 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.039707 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgkl7\" (UniqueName: \"kubernetes.io/projected/9a9d47ea-2f67-4930-b870-cb6a68815b0f-kube-api-access-vgkl7\") pod \"redhat-marketplace-bp525\" (UID: \"9a9d47ea-2f67-4930-b870-cb6a68815b0f\") " pod="openshift-marketplace/redhat-marketplace-bp525"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.039795 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a9d47ea-2f67-4930-b870-cb6a68815b0f-catalog-content\") pod \"redhat-marketplace-bp525\" (UID: \"9a9d47ea-2f67-4930-b870-cb6a68815b0f\") " pod="openshift-marketplace/redhat-marketplace-bp525"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.039891 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a9d47ea-2f67-4930-b870-cb6a68815b0f-utilities\") pod \"redhat-marketplace-bp525\" (UID: \"9a9d47ea-2f67-4930-b870-cb6a68815b0f\") " pod="openshift-marketplace/redhat-marketplace-bp525"
Oct 03 14:03:23 crc kubenswrapper[4636]: E1003 14:03:23.040418 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:23.540403205 +0000 UTC m=+153.399129452 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.100614 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.141436 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a9d47ea-2f67-4930-b870-cb6a68815b0f-utilities\") pod \"redhat-marketplace-bp525\" (UID: \"9a9d47ea-2f67-4930-b870-cb6a68815b0f\") " pod="openshift-marketplace/redhat-marketplace-bp525"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.141539 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgkl7\" (UniqueName: \"kubernetes.io/projected/9a9d47ea-2f67-4930-b870-cb6a68815b0f-kube-api-access-vgkl7\") pod \"redhat-marketplace-bp525\" (UID: \"9a9d47ea-2f67-4930-b870-cb6a68815b0f\") " pod="openshift-marketplace/redhat-marketplace-bp525"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.141570 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.141628 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a9d47ea-2f67-4930-b870-cb6a68815b0f-catalog-content\") pod \"redhat-marketplace-bp525\" (UID: \"9a9d47ea-2f67-4930-b870-cb6a68815b0f\") " pod="openshift-marketplace/redhat-marketplace-bp525"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.142036 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a9d47ea-2f67-4930-b870-cb6a68815b0f-catalog-content\") pod \"redhat-marketplace-bp525\" (UID: \"9a9d47ea-2f67-4930-b870-cb6a68815b0f\") " pod="openshift-marketplace/redhat-marketplace-bp525"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.142255 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a9d47ea-2f67-4930-b870-cb6a68815b0f-utilities\") pod \"redhat-marketplace-bp525\" (UID: \"9a9d47ea-2f67-4930-b870-cb6a68815b0f\") " pod="openshift-marketplace/redhat-marketplace-bp525"
Oct 03 14:03:23 crc kubenswrapper[4636]: E1003 14:03:23.142685 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:23.642675312 +0000 UTC m=+153.501401559 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.172840 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nswpx"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.199148 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7q4hj"]
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.259085 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgkl7\" (UniqueName: \"kubernetes.io/projected/9a9d47ea-2f67-4930-b870-cb6a68815b0f-kube-api-access-vgkl7\") pod \"redhat-marketplace-bp525\" (UID: \"9a9d47ea-2f67-4930-b870-cb6a68815b0f\") " pod="openshift-marketplace/redhat-marketplace-bp525"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.262532 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-82bmp"]
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.262636 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.262967 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-2p8qq"
Oct 03 14:03:23 crc kubenswrapper[4636]: E1003 14:03:23.263024 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:23.763007479 +0000 UTC m=+153.621733726 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.263065 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-2p8qq"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.263686 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bp525"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.324412 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tzzbm"]
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.364983 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:23 crc kubenswrapper[4636]: E1003 14:03:23.366324 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:23.866308522 +0000 UTC m=+153.725034769 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.396051 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.466320 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:23 crc kubenswrapper[4636]: E1003 14:03:23.466682 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:23.96666836 +0000 UTC m=+153.825394607 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.546169 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82bmp" event={"ID":"6040820f-38e4-4416-8648-32025aee8fcb","Type":"ContainerStarted","Data":"a4a47b065f9963273aeccab5feefd92aff40e1efcb58123e818cfa1d43c7ae9f"}
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.549852 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzzbm" event={"ID":"a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34","Type":"ContainerStarted","Data":"9d2f835b2e5ffc0af843603389c31093b950af3fc45993cf76be590337013f03"}
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.553155 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9ba2fb8e95030c77bc274e77a7b855cbfe35e1d8d289da02b014b48a544be105"}
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.554032 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.559844 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"61e0cf54-1f42-4b7d-9787-db8e0cf348aa","Type":"ContainerStarted","Data":"207f741f8d2eb14043e82e209f1194229b47cdfbfeab9bac8d753f3f9e3bdfac"}
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.568084 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:23 crc kubenswrapper[4636]: E1003 14:03:23.568550 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:24.068537716 +0000 UTC m=+153.927263963 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.569092 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" event={"ID":"72d5c706-f441-4f26-99b0-c8979fb0c3f3","Type":"ContainerStarted","Data":"808cf6c53bbbef37282fa416a70360fda14b311bee1b92657c5ad9062ddd78ba"}
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.574929 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7txd" event={"ID":"29d1406c-66ee-4666-9627-e62af43b4f3d","Type":"ContainerStarted","Data":"3aeb4fc0ddb342d056e5b7631307c27ffaea35e67b9cd36eba1735e149858d6f"}
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.574975 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7txd" event={"ID":"29d1406c-66ee-4666-9627-e62af43b4f3d","Type":"ContainerStarted","Data":"42fde5472f6366027d07d257c7142e66a6427f94635bf083d2977334020f6df9"}
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.581603 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"69e9f71f9b9a4bfbd97e1ce1300f5e98a36cfc0dca377245834529fc19516abb"}
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.597893 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7q4hj" event={"ID":"3c76d8db-b385-49b1-b8cf-f7286f3e49c2","Type":"ContainerStarted","Data":"fa4f4aa981db7a9f369e06583da3251819c06265175662a399a8254fe838e68f"}
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.597963 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7q4hj" event={"ID":"3c76d8db-b385-49b1-b8cf-f7286f3e49c2","Type":"ContainerStarted","Data":"0fdf2896d034ddd9e0a4a0185db96a833091514e533bb3de7d7b294c930135b6"}
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.615879 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcndt"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.670012 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:23 crc kubenswrapper[4636]: E1003 14:03:23.671256 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:24.171241384 +0000 UTC m=+154.029967631 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.687084 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lvb2q"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.693521 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.704482 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.705134 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.721670 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r94vn"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.722561 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.722740 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.736334 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.772607 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/95979078-756b-4028-9c35-bc46d056eade-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"95979078-756b-4028-9c35-bc46d056eade\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.772743 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95979078-756b-4028-9c35-bc46d056eade-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"95979078-756b-4028-9c35-bc46d056eade\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.772827 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:23 crc kubenswrapper[4636]: E1003 14:03:23.773895 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:24.27388237 +0000 UTC m=+154.132608617 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.873850 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:23 crc kubenswrapper[4636]: E1003 14:03:23.874470 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:24.374423143 +0000 UTC m=+154.233149390 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.875118 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95979078-756b-4028-9c35-bc46d056eade-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"95979078-756b-4028-9c35-bc46d056eade\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.875193 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.875226 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/95979078-756b-4028-9c35-bc46d056eade-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"95979078-756b-4028-9c35-bc46d056eade\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.875322 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/95979078-756b-4028-9c35-bc46d056eade-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"95979078-756b-4028-9c35-bc46d056eade\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 03 14:03:23 crc kubenswrapper[4636]: E1003 14:03:23.875800 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:24.375789858 +0000 UTC m=+154.234516105 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.903261 4636 patch_prober.go:28] interesting pod/router-default-5444994796-zvvrp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 14:03:23 crc kubenswrapper[4636]: [-]has-synced failed: reason withheld
Oct 03 14:03:23 crc kubenswrapper[4636]: [+]process-running ok
Oct 03 14:03:23 crc kubenswrapper[4636]: healthz check failed
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.903308 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zvvrp" podUID="ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 14:03:23 crc kubenswrapper[4636]: I1003 14:03:23.976404 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:23 crc kubenswrapper[4636]: E1003 14:03:23.976754 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:24.476739871 +0000 UTC m=+154.335466108 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.056816 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-fzp9w"
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.077731 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:24 crc kubenswrapper[4636]: E1003 14:03:24.078324 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:24.578308679 +0000 UTC m=+154.437034926 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.144045 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zv2qw"]
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.145045 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zv2qw"
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.181609 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:24 crc kubenswrapper[4636]: E1003 14:03:24.182535 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:24.682521706 +0000 UTC m=+154.541247953 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.183066 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95979078-756b-4028-9c35-bc46d056eade-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"95979078-756b-4028-9c35-bc46d056eade\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.215678 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.229541 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zv2qw"]
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.282909 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.282966 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhpzv\" (UniqueName: \"kubernetes.io/projected/8cc66cc3-d679-469f-9b34-021345e9007f-kube-api-access-nhpzv\") pod \"redhat-operators-zv2qw\" (UID: \"8cc66cc3-d679-469f-9b34-021345e9007f\") " pod="openshift-marketplace/redhat-operators-zv2qw"
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.283009 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cc66cc3-d679-469f-9b34-021345e9007f-catalog-content\") pod \"redhat-operators-zv2qw\" (UID: \"8cc66cc3-d679-469f-9b34-021345e9007f\") " pod="openshift-marketplace/redhat-operators-zv2qw"
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.283054 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cc66cc3-d679-469f-9b34-021345e9007f-utilities\") pod \"redhat-operators-zv2qw\" (UID: \"8cc66cc3-d679-469f-9b34-021345e9007f\") " pod="openshift-marketplace/redhat-operators-zv2qw"
Oct 03 14:03:24 crc kubenswrapper[4636]: E1003 14:03:24.283345 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:24.783333695 +0000 UTC m=+154.642059942 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.326259 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bp525"]
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.379258 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.402849 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.403056 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cc66cc3-d679-469f-9b34-021345e9007f-catalog-content\") pod \"redhat-operators-zv2qw\" (UID: \"8cc66cc3-d679-469f-9b34-021345e9007f\") " pod="openshift-marketplace/redhat-operators-zv2qw"
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.403154 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cc66cc3-d679-469f-9b34-021345e9007f-utilities\") pod \"redhat-operators-zv2qw\" (UID: \"8cc66cc3-d679-469f-9b34-021345e9007f\") " pod="openshift-marketplace/redhat-operators-zv2qw"
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.403204 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhpzv\" (UniqueName: \"kubernetes.io/projected/8cc66cc3-d679-469f-9b34-021345e9007f-kube-api-access-nhpzv\") pod \"redhat-operators-zv2qw\" (UID: \"8cc66cc3-d679-469f-9b34-021345e9007f\") " pod="openshift-marketplace/redhat-operators-zv2qw"
Oct 03 14:03:24 crc kubenswrapper[4636]: E1003 14:03:24.403387 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:24.903366375 +0000 UTC m=+154.762092622 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.403827 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cc66cc3-d679-469f-9b34-021345e9007f-catalog-content\") pod \"redhat-operators-zv2qw\" (UID: \"8cc66cc3-d679-469f-9b34-021345e9007f\") " pod="openshift-marketplace/redhat-operators-zv2qw"
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.403897 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cc66cc3-d679-469f-9b34-021345e9007f-utilities\") pod \"redhat-operators-zv2qw\" (UID: \"8cc66cc3-d679-469f-9b34-021345e9007f\") " pod="openshift-marketplace/redhat-operators-zv2qw"
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.458449 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhpzv\" (UniqueName: \"kubernetes.io/projected/8cc66cc3-d679-469f-9b34-021345e9007f-kube-api-access-nhpzv\") pod \"redhat-operators-zv2qw\" (UID: \"8cc66cc3-d679-469f-9b34-021345e9007f\") " pod="openshift-marketplace/redhat-operators-zv2qw"
Oct 03 14:03:24 crc kubenswrapper[4636]: E1003 14:03:24.505321 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:25.005305453 +0000 UTC m=+154.864031700 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.504865 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.507129 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wlscj"]
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.508077 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wlscj"
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.517263 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nswpx"]
Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.529545 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zv2qw" Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.551845 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wlscj"] Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.608623 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.608836 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9d200ad-14e6-46da-bc1c-a600294e8600-utilities\") pod \"redhat-operators-wlscj\" (UID: \"d9d200ad-14e6-46da-bc1c-a600294e8600\") " pod="openshift-marketplace/redhat-operators-wlscj" Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.608866 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzx8n\" (UniqueName: \"kubernetes.io/projected/d9d200ad-14e6-46da-bc1c-a600294e8600-kube-api-access-vzx8n\") pod \"redhat-operators-wlscj\" (UID: \"d9d200ad-14e6-46da-bc1c-a600294e8600\") " pod="openshift-marketplace/redhat-operators-wlscj" Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.608918 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9d200ad-14e6-46da-bc1c-a600294e8600-catalog-content\") pod \"redhat-operators-wlscj\" (UID: \"d9d200ad-14e6-46da-bc1c-a600294e8600\") " pod="openshift-marketplace/redhat-operators-wlscj" Oct 03 14:03:24 crc kubenswrapper[4636]: E1003 14:03:24.609027 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:25.109010726 +0000 UTC m=+154.967736973 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.678041 4636 generic.go:334] "Generic (PLEG): container finished" podID="29d1406c-66ee-4666-9627-e62af43b4f3d" containerID="3aeb4fc0ddb342d056e5b7631307c27ffaea35e67b9cd36eba1735e149858d6f" exitCode=0 Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.678404 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7txd" event={"ID":"29d1406c-66ee-4666-9627-e62af43b4f3d","Type":"ContainerDied","Data":"3aeb4fc0ddb342d056e5b7631307c27ffaea35e67b9cd36eba1735e149858d6f"} Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.683472 4636 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.686408 4636 generic.go:334] "Generic (PLEG): container finished" podID="3c76d8db-b385-49b1-b8cf-f7286f3e49c2" containerID="fa4f4aa981db7a9f369e06583da3251819c06265175662a399a8254fe838e68f" exitCode=0 Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.686461 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7q4hj" event={"ID":"3c76d8db-b385-49b1-b8cf-f7286f3e49c2","Type":"ContainerDied","Data":"fa4f4aa981db7a9f369e06583da3251819c06265175662a399a8254fe838e68f"} Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.695706 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp525" event={"ID":"9a9d47ea-2f67-4930-b870-cb6a68815b0f","Type":"ContainerStarted","Data":"679832bac42d7c26bf4ebdc2a087546a858007d742bb12d67a58e1abe07a51f7"} Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.699046 4636 generic.go:334] "Generic (PLEG): container finished" podID="6040820f-38e4-4416-8648-32025aee8fcb" containerID="0086af18414e93472c560ec364dedea333390f0ffb9741d3fda61b6667a405d2" exitCode=0 Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.699124 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82bmp" event={"ID":"6040820f-38e4-4416-8648-32025aee8fcb","Type":"ContainerDied","Data":"0086af18414e93472c560ec364dedea333390f0ffb9741d3fda61b6667a405d2"} Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.711282 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9d200ad-14e6-46da-bc1c-a600294e8600-catalog-content\") pod \"redhat-operators-wlscj\" (UID: \"d9d200ad-14e6-46da-bc1c-a600294e8600\") " pod="openshift-marketplace/redhat-operators-wlscj" Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.711378 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9d200ad-14e6-46da-bc1c-a600294e8600-utilities\") pod \"redhat-operators-wlscj\" (UID: \"d9d200ad-14e6-46da-bc1c-a600294e8600\") " pod="openshift-marketplace/redhat-operators-wlscj" Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.711399 4636 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vzx8n\" (UniqueName: \"kubernetes.io/projected/d9d200ad-14e6-46da-bc1c-a600294e8600-kube-api-access-vzx8n\") pod \"redhat-operators-wlscj\" (UID: \"d9d200ad-14e6-46da-bc1c-a600294e8600\") " pod="openshift-marketplace/redhat-operators-wlscj" Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.711431 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:24 crc kubenswrapper[4636]: E1003 14:03:24.711690 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:25.211678803 +0000 UTC m=+155.070405050 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.712042 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9d200ad-14e6-46da-bc1c-a600294e8600-catalog-content\") pod \"redhat-operators-wlscj\" (UID: \"d9d200ad-14e6-46da-bc1c-a600294e8600\") " pod="openshift-marketplace/redhat-operators-wlscj" Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.712258 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9d200ad-14e6-46da-bc1c-a600294e8600-utilities\") pod \"redhat-operators-wlscj\" (UID: \"d9d200ad-14e6-46da-bc1c-a600294e8600\") " pod="openshift-marketplace/redhat-operators-wlscj" Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.717069 4636 generic.go:334] "Generic (PLEG): container finished" podID="a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34" containerID="1eb4d1f23e9db0c809a1d874809576253d8881828582bf5ab73d5b7a096f49f8" exitCode=0 Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.717181 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzzbm" event={"ID":"a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34","Type":"ContainerDied","Data":"1eb4d1f23e9db0c809a1d874809576253d8881828582bf5ab73d5b7a096f49f8"} Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.727266 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nswpx" event={"ID":"727c69c5-0eaa-4dba-b5a8-131486e3636e","Type":"ContainerStarted","Data":"f032d7e083943c1c6c8586cbe76cfee7835bbcd44571efc9a7022030c10f9a38"} Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.746045 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"61e0cf54-1f42-4b7d-9787-db8e0cf348aa","Type":"ContainerStarted","Data":"948dde8998aaf1cd6d5e19ca78a6cfa50f76f45566d00ddd9bc26d6d6caeb20e"} Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.759351 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzx8n\" (UniqueName: \"kubernetes.io/projected/d9d200ad-14e6-46da-bc1c-a600294e8600-kube-api-access-vzx8n\") pod \"redhat-operators-wlscj\" (UID: \"d9d200ad-14e6-46da-bc1c-a600294e8600\") " pod="openshift-marketplace/redhat-operators-wlscj" Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.814705 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:24 crc kubenswrapper[4636]: E1003 14:03:24.815899 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:25.315885349 +0000 UTC m=+155.174611596 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.830383 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wlscj" Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.916193 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:24 crc kubenswrapper[4636]: E1003 14:03:24.916544 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:25.416533684 +0000 UTC m=+155.275259931 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.950120 4636 patch_prober.go:28] interesting pod/router-default-5444994796-zvvrp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 14:03:24 crc kubenswrapper[4636]: [-]has-synced failed: reason withheld Oct 03 14:03:24 crc kubenswrapper[4636]: [+]process-running ok Oct 03 14:03:24 crc kubenswrapper[4636]: healthz check failed Oct 03 14:03:24 crc kubenswrapper[4636]: I1003 14:03:24.950157 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zvvrp" podUID="ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.018441 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:25 crc kubenswrapper[4636]: E1003 14:03:25.018782 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:25.51875094 +0000 UTC m=+155.377477177 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.019009 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:25 crc kubenswrapper[4636]: E1003 14:03:25.019378 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:25.519370635 +0000 UTC m=+155.378096872 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.070216 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=4.070199591 podStartE2EDuration="4.070199591s" podCreationTimestamp="2025-10-03 14:03:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:24.993384873 +0000 UTC m=+154.852111120" watchObservedRunningTime="2025-10-03 14:03:25.070199591 +0000 UTC m=+154.928925838" Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.128666 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:25 crc kubenswrapper[4636]: E1003 14:03:25.130681 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:25.630658562 +0000 UTC m=+155.489384809 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.231608 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:25 crc kubenswrapper[4636]: E1003 14:03:25.232084 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:25.732072167 +0000 UTC m=+155.590798414 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.263964 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zv2qw"] Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.335030 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:25 crc kubenswrapper[4636]: E1003 14:03:25.335639 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:25.835620606 +0000 UTC m=+155.694346853 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.439146 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:25 crc kubenswrapper[4636]: E1003 14:03:25.439471 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:25.939458053 +0000 UTC m=+155.798184300 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.542786 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:25 crc kubenswrapper[4636]: E1003 14:03:25.542926 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:26.042897359 +0000 UTC m=+155.901623606 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.543242 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:25 crc kubenswrapper[4636]: E1003 14:03:25.543601 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:26.043589477 +0000 UTC m=+155.902315724 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.550017 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 14:03:25 crc kubenswrapper[4636]: W1003 14:03:25.597752 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod95979078_756b_4028_9c35_bc46d056eade.slice/crio-08ae5a8ba6d8c3e58cf3811f39d5c8338e1c3096fd2a6282a44298920c5debe1 WatchSource:0}: Error finding container 08ae5a8ba6d8c3e58cf3811f39d5c8338e1c3096fd2a6282a44298920c5debe1: Status 404 returned error can't find the container with id 08ae5a8ba6d8c3e58cf3811f39d5c8338e1c3096fd2a6282a44298920c5debe1 Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.644632 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:25 crc kubenswrapper[4636]: E1003 14:03:25.645029 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:26.145013422 +0000 UTC m=+156.003739669 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.681482 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wlscj"] Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.745731 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:25 crc kubenswrapper[4636]: E1003 14:03:25.746023 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:26.246010346 +0000 UTC m=+156.104736593 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.791584 4636 generic.go:334] "Generic (PLEG): container finished" podID="8cc66cc3-d679-469f-9b34-021345e9007f" containerID="1d956c3f04c3abf69cf8f2d22c171011fad73a6ccceb8158ac7ea24a3befc2e5" exitCode=0 Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.791672 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zv2qw" event={"ID":"8cc66cc3-d679-469f-9b34-021345e9007f","Type":"ContainerDied","Data":"1d956c3f04c3abf69cf8f2d22c171011fad73a6ccceb8158ac7ea24a3befc2e5"} Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.791708 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zv2qw" event={"ID":"8cc66cc3-d679-469f-9b34-021345e9007f","Type":"ContainerStarted","Data":"d2ed9f937758375c1da5307806db4308bb9f8504f76bbfa83001d2184d63c156"} Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.802947 4636 generic.go:334] "Generic (PLEG): container finished" podID="9a9d47ea-2f67-4930-b870-cb6a68815b0f" containerID="5043dc50950d669fa4c7c93a4109a8c0496681504103306556ebc2d5dc9f6451" exitCode=0 Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.803030 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp525" event={"ID":"9a9d47ea-2f67-4930-b870-cb6a68815b0f","Type":"ContainerDied","Data":"5043dc50950d669fa4c7c93a4109a8c0496681504103306556ebc2d5dc9f6451"} Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.845606 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"95979078-756b-4028-9c35-bc46d056eade","Type":"ContainerStarted","Data":"08ae5a8ba6d8c3e58cf3811f39d5c8338e1c3096fd2a6282a44298920c5debe1"} Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.849356 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:25 crc kubenswrapper[4636]: E1003 14:03:25.849698 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:26.349679118 +0000 UTC m=+156.208405365 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.858026 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlscj" event={"ID":"d9d200ad-14e6-46da-bc1c-a600294e8600","Type":"ContainerStarted","Data":"08138f5b364a95920a177bfa764127c3484c2ba021b5bb725872a11f229853fe"} Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.875985 4636 generic.go:334] "Generic (PLEG): container finished" podID="727c69c5-0eaa-4dba-b5a8-131486e3636e" containerID="73f743169419e0c48902a37f5f6c65c6362b2b07ea9654e3b3bdb41f93aefd2e" exitCode=0 Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.876056 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nswpx" event={"ID":"727c69c5-0eaa-4dba-b5a8-131486e3636e","Type":"ContainerDied","Data":"73f743169419e0c48902a37f5f6c65c6362b2b07ea9654e3b3bdb41f93aefd2e"} Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.880700 4636 generic.go:334] "Generic (PLEG): container finished" podID="61e0cf54-1f42-4b7d-9787-db8e0cf348aa" containerID="948dde8998aaf1cd6d5e19ca78a6cfa50f76f45566d00ddd9bc26d6d6caeb20e" exitCode=0 Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.880894 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"61e0cf54-1f42-4b7d-9787-db8e0cf348aa","Type":"ContainerDied","Data":"948dde8998aaf1cd6d5e19ca78a6cfa50f76f45566d00ddd9bc26d6d6caeb20e"} Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.911755 4636 patch_prober.go:28] interesting pod/router-default-5444994796-zvvrp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 14:03:25 crc kubenswrapper[4636]: [-]has-synced failed: reason withheld Oct 03 14:03:25 crc kubenswrapper[4636]: [+]process-running ok Oct 03 14:03:25 crc kubenswrapper[4636]: healthz check failed Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.911851 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zvvrp" podUID="ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.917402 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" event={"ID":"72d5c706-f441-4f26-99b0-c8979fb0c3f3","Type":"ContainerStarted","Data":"9c63ee2d16d2b9d3651dc84409c805daa0064e39a9a52ed7d1e068917db70645"} Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.917450 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" event={"ID":"72d5c706-f441-4f26-99b0-c8979fb0c3f3","Type":"ContainerStarted","Data":"cfe858977e2dae184e9851883b586a95f16887ecd7dbad4f78b41211227f86df"} Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.950833 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:25 crc kubenswrapper[4636]: E1003 14:03:25.951626 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:26.451605116 +0000 UTC m=+156.310331363 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.990628 4636 patch_prober.go:28] interesting pod/apiserver-76f77b778f-2p8qq container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 03 14:03:25 crc kubenswrapper[4636]: [+]log ok Oct 03 14:03:25 crc kubenswrapper[4636]: [+]etcd ok Oct 03 14:03:25 crc kubenswrapper[4636]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 03 14:03:25 crc kubenswrapper[4636]: [+]poststarthook/generic-apiserver-start-informers ok Oct 03 14:03:25 crc kubenswrapper[4636]: [+]poststarthook/max-in-flight-filter ok Oct 03 14:03:25 crc kubenswrapper[4636]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 03 14:03:25 crc kubenswrapper[4636]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 03 14:03:25 crc kubenswrapper[4636]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 03 14:03:25 crc kubenswrapper[4636]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 03 14:03:25 crc kubenswrapper[4636]: [+]poststarthook/project.openshift.io-projectcache ok Oct 03 14:03:25 crc kubenswrapper[4636]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 03 14:03:25 crc kubenswrapper[4636]: [+]poststarthook/openshift.io-startinformers ok Oct 03 14:03:25 crc kubenswrapper[4636]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 03 14:03:25 crc kubenswrapper[4636]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 03 14:03:25 crc kubenswrapper[4636]: livez check failed Oct 03 14:03:25 crc kubenswrapper[4636]: I1003 14:03:25.990698 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-2p8qq" podUID="62f6cc0c-eb9d-44ef-8ce7-93a6148c3264" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 14:03:26 crc kubenswrapper[4636]: I1003 14:03:26.053822 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:26 crc 
kubenswrapper[4636]: E1003 14:03:26.055317 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:26.555301199 +0000 UTC m=+156.414027446 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:26 crc kubenswrapper[4636]: I1003 14:03:26.082467 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-ljmrj" podStartSLOduration=16.082444751 podStartE2EDuration="16.082444751s" podCreationTimestamp="2025-10-03 14:03:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:26.080275566 +0000 UTC m=+155.939001823" watchObservedRunningTime="2025-10-03 14:03:26.082444751 +0000 UTC m=+155.941170998" Oct 03 14:03:26 crc kubenswrapper[4636]: I1003 14:03:26.155374 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:26 crc kubenswrapper[4636]: E1003 14:03:26.155825 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:26.655803801 +0000 UTC m=+156.514530048 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:26 crc kubenswrapper[4636]: I1003 14:03:26.258565 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:26 crc kubenswrapper[4636]: E1003 14:03:26.258998 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:26.758982311 +0000 UTC m=+156.617708558 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:26 crc kubenswrapper[4636]: I1003 14:03:26.359863 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:26 crc kubenswrapper[4636]: E1003 14:03:26.360425 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:26.860409535 +0000 UTC m=+156.719135782 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:26 crc kubenswrapper[4636]: I1003 14:03:26.461058 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:26 crc kubenswrapper[4636]: E1003 14:03:26.461380 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:26.961326877 +0000 UTC m=+156.820053124 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:26 crc kubenswrapper[4636]: I1003 14:03:26.461478 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" Oct 03 14:03:26 crc kubenswrapper[4636]: E1003 14:03:26.461852 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:26.96182897 +0000 UTC m=+156.820555217 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 14:03:26 crc kubenswrapper[4636]: I1003 14:03:26.563618 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 14:03:26 crc kubenswrapper[4636]: E1003 14:03:26.563978 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:27.063901801 +0000 UTC m=+156.922628058 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:26 crc kubenswrapper[4636]: I1003 14:03:26.625921 4636 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Oct 03 14:03:26 crc kubenswrapper[4636]: I1003 14:03:26.671882 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:26 crc kubenswrapper[4636]: E1003 14:03:26.672313 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:27.172292084 +0000 UTC m=+157.031018321 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:26 crc kubenswrapper[4636]: I1003 14:03:26.775192 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:26 crc kubenswrapper[4636]: E1003 14:03:26.775856 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 14:03:27.275832983 +0000 UTC m=+157.134559220 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:26 crc kubenswrapper[4636]: I1003 14:03:26.877061 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:26 crc kubenswrapper[4636]: E1003 14:03:26.877593 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 14:03:27.377576406 +0000 UTC m=+157.236302653 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pk6zb" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 14:03:26 crc kubenswrapper[4636]: I1003 14:03:26.901599 4636 patch_prober.go:28] interesting pod/router-default-5444994796-zvvrp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 14:03:26 crc kubenswrapper[4636]: [-]has-synced failed: reason withheld
Oct 03 14:03:26 crc kubenswrapper[4636]: [+]process-running ok
Oct 03 14:03:26 crc kubenswrapper[4636]: healthz check failed
Oct 03 14:03:26 crc kubenswrapper[4636]: I1003 14:03:26.902030 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zvvrp" podUID="ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 14:03:26 crc kubenswrapper[4636]: I1003 14:03:26.926403 4636 generic.go:334] "Generic (PLEG): container finished" podID="d9d200ad-14e6-46da-bc1c-a600294e8600" containerID="002be0d0f665cbc1af4178597b1dca06a7885c1ff320ad53c77cf46f394b1db8" exitCode=0
Oct 03 14:03:26 crc kubenswrapper[4636]: I1003 14:03:26.926749 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlscj" event={"ID":"d9d200ad-14e6-46da-bc1c-a600294e8600","Type":"ContainerDied","Data":"002be0d0f665cbc1af4178597b1dca06a7885c1ff320ad53c77cf46f394b1db8"}
Oct 03 14:03:26 crc kubenswrapper[4636]: I1003 14:03:26.929251 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"95979078-756b-4028-9c35-bc46d056eade","Type":"ContainerStarted","Data":"317556182b18253a2ffb2d51fb16c5b7348a466a01da870c0380075d034302ea"}
Oct 03 14:03:26 crc kubenswrapper[4636]: I1003 14:03:26.955663 4636 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-03T14:03:26.625955193Z","Handler":null,"Name":""}
Oct 03 14:03:26 crc kubenswrapper[4636]: I1003 14:03:26.970707 4636 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Oct 03 14:03:26 crc kubenswrapper[4636]: I1003 14:03:26.970753 4636 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Oct 03 14:03:26 crc kubenswrapper[4636]: I1003 14:03:26.978630 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 14:03:27 crc kubenswrapper[4636]: I1003 14:03:27.000986 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 03 14:03:27 crc kubenswrapper[4636]: I1003 14:03:27.048826 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=4.048801651 podStartE2EDuration="4.048801651s" podCreationTimestamp="2025-10-03 14:03:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:26.972387923 +0000 UTC m=+156.831114180" watchObservedRunningTime="2025-10-03 14:03:27.048801651 +0000 UTC m=+156.907527898"
Oct 03 14:03:27 crc kubenswrapper[4636]: I1003 14:03:27.089392 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:27 crc kubenswrapper[4636]: I1003 14:03:27.361254 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 03 14:03:27 crc kubenswrapper[4636]: I1003 14:03:27.384466 4636 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 03 14:03:27 crc kubenswrapper[4636]: I1003 14:03:27.384561 4636 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:27 crc kubenswrapper[4636]: I1003 14:03:27.394883 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/61e0cf54-1f42-4b7d-9787-db8e0cf348aa-kube-api-access\") pod \"61e0cf54-1f42-4b7d-9787-db8e0cf348aa\" (UID: \"61e0cf54-1f42-4b7d-9787-db8e0cf348aa\") "
Oct 03 14:03:27 crc kubenswrapper[4636]: I1003 14:03:27.394961 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/61e0cf54-1f42-4b7d-9787-db8e0cf348aa-kubelet-dir\") pod \"61e0cf54-1f42-4b7d-9787-db8e0cf348aa\" (UID: \"61e0cf54-1f42-4b7d-9787-db8e0cf348aa\") "
Oct 03 14:03:27 crc kubenswrapper[4636]: I1003 14:03:27.395988 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61e0cf54-1f42-4b7d-9787-db8e0cf348aa-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "61e0cf54-1f42-4b7d-9787-db8e0cf348aa" (UID: "61e0cf54-1f42-4b7d-9787-db8e0cf348aa"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 03 14:03:27 crc kubenswrapper[4636]: I1003 14:03:27.405533 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61e0cf54-1f42-4b7d-9787-db8e0cf348aa-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "61e0cf54-1f42-4b7d-9787-db8e0cf348aa" (UID: "61e0cf54-1f42-4b7d-9787-db8e0cf348aa"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:03:27 crc kubenswrapper[4636]: I1003 14:03:27.448859 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pk6zb\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") " pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:27 crc kubenswrapper[4636]: I1003 14:03:27.497620 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/61e0cf54-1f42-4b7d-9787-db8e0cf348aa-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 03 14:03:27 crc kubenswrapper[4636]: I1003 14:03:27.497651 4636 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/61e0cf54-1f42-4b7d-9787-db8e0cf348aa-kubelet-dir\") on node \"crc\" DevicePath \"\""
Oct 03 14:03:27 crc kubenswrapper[4636]: I1003 14:03:27.713680 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:27 crc kubenswrapper[4636]: I1003 14:03:27.923447 4636 patch_prober.go:28] interesting pod/router-default-5444994796-zvvrp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 14:03:27 crc kubenswrapper[4636]: [-]has-synced failed: reason withheld
Oct 03 14:03:27 crc kubenswrapper[4636]: [+]process-running ok
Oct 03 14:03:27 crc kubenswrapper[4636]: healthz check failed
Oct 03 14:03:27 crc kubenswrapper[4636]: I1003 14:03:27.923508 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zvvrp" podUID="ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 14:03:27 crc kubenswrapper[4636]: I1003 14:03:27.982553 4636 generic.go:334] "Generic (PLEG): container finished" podID="95979078-756b-4028-9c35-bc46d056eade" containerID="317556182b18253a2ffb2d51fb16c5b7348a466a01da870c0380075d034302ea" exitCode=0
Oct 03 14:03:27 crc kubenswrapper[4636]: I1003 14:03:27.982765 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"95979078-756b-4028-9c35-bc46d056eade","Type":"ContainerDied","Data":"317556182b18253a2ffb2d51fb16c5b7348a466a01da870c0380075d034302ea"}
Oct 03 14:03:28 crc kubenswrapper[4636]: I1003 14:03:28.025245 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"61e0cf54-1f42-4b7d-9787-db8e0cf348aa","Type":"ContainerDied","Data":"207f741f8d2eb14043e82e209f1194229b47cdfbfeab9bac8d753f3f9e3bdfac"}
Oct 03 14:03:28 crc kubenswrapper[4636]: I1003 14:03:28.025292 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="207f741f8d2eb14043e82e209f1194229b47cdfbfeab9bac8d753f3f9e3bdfac"
Oct 03 14:03:28 crc kubenswrapper[4636]: I1003 14:03:28.025371 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 03 14:03:28 crc kubenswrapper[4636]: I1003 14:03:28.251795 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-2p8qq"
Oct 03 14:03:28 crc kubenswrapper[4636]: I1003 14:03:28.283416 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-2p8qq"
Oct 03 14:03:28 crc kubenswrapper[4636]: I1003 14:03:28.437853 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pk6zb"]
Oct 03 14:03:28 crc kubenswrapper[4636]: W1003 14:03:28.486518 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod967e0eca_11d1_4fb6_bba5_5fe993aaeac3.slice/crio-cb946e2061c6133754f8a032fd847b9ce6096233f41475de1ab4b60f7a4f37fe WatchSource:0}: Error finding container cb946e2061c6133754f8a032fd847b9ce6096233f41475de1ab4b60f7a4f37fe: Status 404 returned error can't find the container with id cb946e2061c6133754f8a032fd847b9ce6096233f41475de1ab4b60f7a4f37fe
Oct 03 14:03:28 crc kubenswrapper[4636]: I1003 14:03:28.742907 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-bgrwl"
Oct 03 14:03:28 crc kubenswrapper[4636]: I1003 14:03:28.825115 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Oct 03 14:03:28 crc kubenswrapper[4636]: I1003 14:03:28.900618 4636 patch_prober.go:28] interesting pod/router-default-5444994796-zvvrp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 14:03:28 crc kubenswrapper[4636]: [-]has-synced failed: reason withheld
Oct 03 14:03:28 crc kubenswrapper[4636]: [+]process-running ok
Oct 03 14:03:28 crc kubenswrapper[4636]: healthz check failed
Oct 03 14:03:28 crc kubenswrapper[4636]: I1003 14:03:28.900709 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zvvrp" podUID="ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 14:03:29 crc kubenswrapper[4636]: I1003 14:03:29.110066 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" event={"ID":"967e0eca-11d1-4fb6-bba5-5fe993aaeac3","Type":"ContainerStarted","Data":"bcfef30fa5c1af62e4d71a887b182aa13e4760db7e643f19fdf3a43034f32132"}
Oct 03 14:03:29 crc kubenswrapper[4636]: I1003 14:03:29.110154 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" event={"ID":"967e0eca-11d1-4fb6-bba5-5fe993aaeac3","Type":"ContainerStarted","Data":"cb946e2061c6133754f8a032fd847b9ce6096233f41475de1ab4b60f7a4f37fe"}
Oct 03 14:03:29 crc kubenswrapper[4636]: I1003 14:03:29.110303 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:29 crc kubenswrapper[4636]: I1003 14:03:29.142523 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" podStartSLOduration=137.142510495 podStartE2EDuration="2m17.142510495s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:03:29.141174851 +0000 UTC m=+158.999901098" watchObservedRunningTime="2025-10-03 14:03:29.142510495 +0000 UTC m=+159.001236742"
Oct 03 14:03:29 crc kubenswrapper[4636]: I1003 14:03:29.590333 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 03 14:03:29 crc kubenswrapper[4636]: I1003 14:03:29.749686 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/95979078-756b-4028-9c35-bc46d056eade-kubelet-dir\") pod \"95979078-756b-4028-9c35-bc46d056eade\" (UID: \"95979078-756b-4028-9c35-bc46d056eade\") "
Oct 03 14:03:29 crc kubenswrapper[4636]: I1003 14:03:29.749804 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95979078-756b-4028-9c35-bc46d056eade-kube-api-access\") pod \"95979078-756b-4028-9c35-bc46d056eade\" (UID: \"95979078-756b-4028-9c35-bc46d056eade\") "
Oct 03 14:03:29 crc kubenswrapper[4636]: I1003 14:03:29.750262 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95979078-756b-4028-9c35-bc46d056eade-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "95979078-756b-4028-9c35-bc46d056eade" (UID: "95979078-756b-4028-9c35-bc46d056eade"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 03 14:03:29 crc kubenswrapper[4636]: I1003 14:03:29.756241 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95979078-756b-4028-9c35-bc46d056eade-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "95979078-756b-4028-9c35-bc46d056eade" (UID: "95979078-756b-4028-9c35-bc46d056eade"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:03:29 crc kubenswrapper[4636]: I1003 14:03:29.851588 4636 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/95979078-756b-4028-9c35-bc46d056eade-kubelet-dir\") on node \"crc\" DevicePath \"\""
Oct 03 14:03:29 crc kubenswrapper[4636]: I1003 14:03:29.851615 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95979078-756b-4028-9c35-bc46d056eade-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 03 14:03:29 crc kubenswrapper[4636]: I1003 14:03:29.904653 4636 patch_prober.go:28] interesting pod/router-default-5444994796-zvvrp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 14:03:29 crc kubenswrapper[4636]: [-]has-synced failed: reason withheld
Oct 03 14:03:29 crc kubenswrapper[4636]: [+]process-running ok
Oct 03 14:03:29 crc kubenswrapper[4636]: healthz check failed
Oct 03 14:03:29 crc kubenswrapper[4636]: I1003 14:03:29.904753 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zvvrp" podUID="ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 14:03:30 crc kubenswrapper[4636]: I1003 14:03:30.135370 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 03 14:03:30 crc kubenswrapper[4636]: I1003 14:03:30.135363 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"95979078-756b-4028-9c35-bc46d056eade","Type":"ContainerDied","Data":"08ae5a8ba6d8c3e58cf3811f39d5c8338e1c3096fd2a6282a44298920c5debe1"}
Oct 03 14:03:30 crc kubenswrapper[4636]: I1003 14:03:30.135424 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08ae5a8ba6d8c3e58cf3811f39d5c8338e1c3096fd2a6282a44298920c5debe1"
Oct 03 14:03:30 crc kubenswrapper[4636]: I1003 14:03:30.899592 4636 patch_prober.go:28] interesting pod/router-default-5444994796-zvvrp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 14:03:30 crc kubenswrapper[4636]: [-]has-synced failed: reason withheld
Oct 03 14:03:30 crc kubenswrapper[4636]: [+]process-running ok
Oct 03 14:03:30 crc kubenswrapper[4636]: healthz check failed
Oct 03 14:03:30 crc kubenswrapper[4636]: I1003 14:03:30.899638 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zvvrp" podUID="ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 14:03:31 crc kubenswrapper[4636]: I1003 14:03:31.900140 4636 patch_prober.go:28] interesting pod/router-default-5444994796-zvvrp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 14:03:31 crc kubenswrapper[4636]: [-]has-synced failed: reason withheld
Oct 03 14:03:31 crc kubenswrapper[4636]: [+]process-running ok
Oct 03 14:03:31 crc kubenswrapper[4636]: healthz check failed
Oct 03 14:03:31 crc kubenswrapper[4636]: I1003 14:03:31.900192 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zvvrp" podUID="ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 14:03:32 crc kubenswrapper[4636]: I1003 14:03:32.754902 4636 patch_prober.go:28] interesting pod/console-f9d7485db-lqxss container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Oct 03 14:03:32 crc kubenswrapper[4636]: I1003 14:03:32.755298 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-lqxss" podUID="f6977d44-d8ff-4d40-959f-024da50c53fe" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused"
Oct 03 14:03:32 crc kubenswrapper[4636]: I1003 14:03:32.909812 4636 patch_prober.go:28] interesting pod/router-default-5444994796-zvvrp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 14:03:32 crc kubenswrapper[4636]: [-]has-synced failed: reason withheld
Oct 03 14:03:32 crc kubenswrapper[4636]: [+]process-running ok
Oct 03 14:03:32 crc kubenswrapper[4636]: healthz check failed
Oct 03 14:03:32 crc kubenswrapper[4636]: I1003 14:03:32.911322 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zvvrp" podUID="ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 14:03:32 crc kubenswrapper[4636]: I1003 14:03:32.997371 4636 patch_prober.go:28] interesting pod/downloads-7954f5f757-5q7j6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Oct 03 14:03:32 crc kubenswrapper[4636]: I1003 14:03:32.997421 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5q7j6" podUID="40e1f28c-6d64-4fa1-b554-507ff389f115" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Oct 03 14:03:32 crc kubenswrapper[4636]: I1003 14:03:32.999341 4636 patch_prober.go:28] interesting pod/downloads-7954f5f757-5q7j6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Oct 03 14:03:32 crc kubenswrapper[4636]: I1003 14:03:32.999371 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5q7j6" podUID="40e1f28c-6d64-4fa1-b554-507ff389f115" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Oct 03 14:03:33 crc kubenswrapper[4636]: I1003 14:03:33.900793 4636 patch_prober.go:28] interesting pod/router-default-5444994796-zvvrp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 14:03:33 crc kubenswrapper[4636]: [-]has-synced failed: reason withheld
Oct 03 14:03:33 crc kubenswrapper[4636]: [+]process-running ok
Oct 03 14:03:33 crc kubenswrapper[4636]: healthz check failed
Oct 03 14:03:33 crc kubenswrapper[4636]: I1003 14:03:33.900900 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zvvrp" podUID="ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 14:03:34 crc kubenswrapper[4636]: I1003 14:03:34.901635 4636 patch_prober.go:28] interesting pod/router-default-5444994796-zvvrp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 14:03:34 crc kubenswrapper[4636]: [-]has-synced failed: reason withheld
Oct 03 14:03:34 crc kubenswrapper[4636]: [+]process-running ok
Oct 03 14:03:34 crc kubenswrapper[4636]: healthz check failed
Oct 03 14:03:34 crc kubenswrapper[4636]: I1003 14:03:34.901694 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zvvrp" podUID="ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 14:03:34 crc kubenswrapper[4636]: I1003 14:03:34.938265 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs\") pod \"network-metrics-daemon-vm9z7\" (UID: \"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\") " pod="openshift-multus/network-metrics-daemon-vm9z7"
Oct 03 14:03:34 crc kubenswrapper[4636]: I1003 14:03:34.946978 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7f8fb91-fbef-43b5-b771-f376cfbb1cdd-metrics-certs\") pod \"network-metrics-daemon-vm9z7\" (UID: \"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd\") " pod="openshift-multus/network-metrics-daemon-vm9z7"
Oct 03 14:03:35 crc kubenswrapper[4636]: I1003 14:03:35.020553 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm9z7"
Oct 03 14:03:35 crc kubenswrapper[4636]: I1003 14:03:35.901827 4636 patch_prober.go:28] interesting pod/router-default-5444994796-zvvrp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 14:03:35 crc kubenswrapper[4636]: [-]has-synced failed: reason withheld
Oct 03 14:03:35 crc kubenswrapper[4636]: [+]process-running ok
Oct 03 14:03:35 crc kubenswrapper[4636]: healthz check failed
Oct 03 14:03:35 crc kubenswrapper[4636]: I1003 14:03:35.902501 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zvvrp" podUID="ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 14:03:36 crc kubenswrapper[4636]: I1003 14:03:36.901465 4636 patch_prober.go:28] interesting pod/router-default-5444994796-zvvrp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 14:03:36 crc kubenswrapper[4636]: [-]has-synced failed: reason withheld
Oct 03 14:03:36 crc kubenswrapper[4636]: [+]process-running ok
Oct 03 14:03:36 crc kubenswrapper[4636]: healthz check failed
Oct 03 14:03:36 crc kubenswrapper[4636]: I1003 14:03:36.901535 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zvvrp" podUID="ddcdaf13-a4b8-43c6-9e69-2fd8d8594f76" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 14:03:37 crc kubenswrapper[4636]: I1003 14:03:37.902393 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-zvvrp"
Oct 03 14:03:37 crc kubenswrapper[4636]: I1003 14:03:37.907881 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-zvvrp"
Oct 03 14:03:39 crc kubenswrapper[4636]: I1003 14:03:39.163548 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:03:39 crc kubenswrapper[4636]: I1003 14:03:39.163611 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:03:42 crc kubenswrapper[4636]: I1003 14:03:42.757937 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-lqxss"
Oct 03 14:03:42 crc kubenswrapper[4636]: I1003 14:03:42.762032 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-lqxss"
Oct 03 14:03:42 crc kubenswrapper[4636]: I1003 14:03:42.997652 4636 patch_prober.go:28] interesting pod/downloads-7954f5f757-5q7j6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Oct 03 14:03:42 crc kubenswrapper[4636]: I1003 14:03:42.997652 4636 patch_prober.go:28] interesting pod/downloads-7954f5f757-5q7j6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Oct 03 14:03:42 crc kubenswrapper[4636]: I1003 14:03:42.997708 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5q7j6" podUID="40e1f28c-6d64-4fa1-b554-507ff389f115" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Oct 03 14:03:42 crc kubenswrapper[4636]: I1003 14:03:42.997735 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5q7j6" podUID="40e1f28c-6d64-4fa1-b554-507ff389f115" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Oct 03 14:03:42 crc kubenswrapper[4636]: I1003 14:03:42.997790 4636 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-5q7j6"
Oct 03 14:03:42 crc kubenswrapper[4636]: I1003 14:03:42.998298 4636 patch_prober.go:28] interesting pod/downloads-7954f5f757-5q7j6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Oct 03 14:03:42 crc kubenswrapper[4636]: I1003 14:03:42.998324 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5q7j6" podUID="40e1f28c-6d64-4fa1-b554-507ff389f115" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Oct 03 14:03:42 crc kubenswrapper[4636]: I1003 14:03:42.998435 4636 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"980ddaf169e186d1951a2c48575a87579bb271831e49690ac775f61524d368c9"} pod="openshift-console/downloads-7954f5f757-5q7j6" containerMessage="Container download-server failed liveness probe, will be restarted"
Oct 03 14:03:42 crc kubenswrapper[4636]: I1003 14:03:42.998529 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-5q7j6" podUID="40e1f28c-6d64-4fa1-b554-507ff389f115" containerName="download-server" containerID="cri-o://980ddaf169e186d1951a2c48575a87579bb271831e49690ac775f61524d368c9" gracePeriod=2
Oct 03 14:03:44 crc kubenswrapper[4636]: I1003 14:03:44.302728 4636 generic.go:334] "Generic (PLEG): container finished" podID="40e1f28c-6d64-4fa1-b554-507ff389f115" containerID="980ddaf169e186d1951a2c48575a87579bb271831e49690ac775f61524d368c9" exitCode=0
Oct 03 14:03:44 crc kubenswrapper[4636]: I1003 14:03:44.302828 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5q7j6" event={"ID":"40e1f28c-6d64-4fa1-b554-507ff389f115","Type":"ContainerDied","Data":"980ddaf169e186d1951a2c48575a87579bb271831e49690ac775f61524d368c9"}
Oct 03 14:03:47 crc kubenswrapper[4636]: I1003 14:03:47.719277 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:03:52 crc kubenswrapper[4636]: I1003 14:03:52.998382 4636 patch_prober.go:28] interesting pod/downloads-7954f5f757-5q7j6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Oct 03 14:03:52 crc kubenswrapper[4636]: I1003 14:03:52.999279 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5q7j6" podUID="40e1f28c-6d64-4fa1-b554-507ff389f115" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Oct 03 14:03:53 crc kubenswrapper[4636]: I1003 14:03:53.650414 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gjrbj"
Oct 03 14:04:00 crc kubenswrapper[4636]: I1003 14:04:00.046263 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 14:04:02 crc kubenswrapper[4636]: I1003 14:04:02.997599 4636 patch_prober.go:28] interesting pod/downloads-7954f5f757-5q7j6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Oct 03 14:04:02 crc kubenswrapper[4636]: I1003 14:04:02.997945 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5q7j6" podUID="40e1f28c-6d64-4fa1-b554-507ff389f115" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Oct 03 14:04:09 crc kubenswrapper[4636]: I1003 14:04:09.162646 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:04:09 crc kubenswrapper[4636]: I1003 14:04:09.163201 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:04:12 crc kubenswrapper[4636]: I1003 14:04:12.997327 4636 patch_prober.go:28] interesting pod/downloads-7954f5f757-5q7j6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Oct 03 14:04:12 crc kubenswrapper[4636]: I1003 14:04:12.997396 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5q7j6" podUID="40e1f28c-6d64-4fa1-b554-507ff389f115" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Oct 03 14:04:15 crc kubenswrapper[4636]: E1003 14:04:15.004357 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Oct 03 14:04:15 crc kubenswrapper[4636]: E1003 14:04:15.004961 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vzx8n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-wlscj_openshift-marketplace(d9d200ad-14e6-46da-bc1c-a600294e8600): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 03 14:04:15 crc kubenswrapper[4636]: E1003 14:04:15.006152 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-wlscj" podUID="d9d200ad-14e6-46da-bc1c-a600294e8600"
Oct 03 14:04:16 crc kubenswrapper[4636]: E1003 14:04:16.301496 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-wlscj" podUID="d9d200ad-14e6-46da-bc1c-a600294e8600"
Oct 03 14:04:16 crc kubenswrapper[4636]: E1003 14:04:16.411149 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Oct 03 14:04:16 crc kubenswrapper[4636]: E1003 14:04:16.411322 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7srxq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-tzzbm_openshift-marketplace(a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 03 14:04:16 crc kubenswrapper[4636]: E1003 14:04:16.412504 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-tzzbm" podUID="a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34"
Oct 03 14:04:16 crc kubenswrapper[4636]: E1003 14:04:16.424787 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Oct 03 14:04:16 crc kubenswrapper[4636]: E1003 14:04:16.424940 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nhpzv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zv2qw_openshift-marketplace(8cc66cc3-d679-469f-9b34-021345e9007f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 03 14:04:16 crc kubenswrapper[4636]: E1003 14:04:16.426878 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-zv2qw" podUID="8cc66cc3-d679-469f-9b34-021345e9007f"
Oct 03 14:04:17 crc kubenswrapper[4636]: E1003 14:04:17.874828 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zv2qw" podUID="8cc66cc3-d679-469f-9b34-021345e9007f"
Oct 03 14:04:17 crc kubenswrapper[4636]: E1003 14:04:17.881934 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-tzzbm" podUID="a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34"
Oct 03 14:04:17 crc kubenswrapper[4636]: E1003 14:04:17.956816 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Oct 03 14:04:17 crc kubenswrapper[4636]: E1003 14:04:17.956973 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c45pj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-7q4hj_openshift-marketplace(3c76d8db-b385-49b1-b8cf-f7286f3e49c2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 03 14:04:17 crc kubenswrapper[4636]: E1003 14:04:17.958146 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-7q4hj" podUID="3c76d8db-b385-49b1-b8cf-f7286f3e49c2"
Oct 03 14:04:18 crc kubenswrapper[4636]: E1003 14:04:18.692368 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-7q4hj" podUID="3c76d8db-b385-49b1-b8cf-f7286f3e49c2"
Oct 03 14:04:18 crc kubenswrapper[4636]: E1003 14:04:18.812257 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Oct 03 14:04:18 crc kubenswrapper[4636]: E1003 14:04:18.821575 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vgkl7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-bp525_openshift-marketplace(9a9d47ea-2f67-4930-b870-cb6a68815b0f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 03 14:04:18 crc kubenswrapper[4636]: E1003 14:04:18.824640 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-bp525" podUID="9a9d47ea-2f67-4930-b870-cb6a68815b0f"
Oct 03 14:04:19 crc kubenswrapper[4636]: E1003 14:04:19.001128 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Oct 03 14:04:19 crc kubenswrapper[4636]: E1003 14:04:19.001392 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f2gsw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-82bmp_openshift-marketplace(6040820f-38e4-4416-8648-32025aee8fcb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 03 14:04:19 crc kubenswrapper[4636]: E1003 14:04:19.002553 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-82bmp" podUID="6040820f-38e4-4416-8648-32025aee8fcb"
Oct 03 14:04:19 crc kubenswrapper[4636]: I1003 14:04:19.120485 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vm9z7"]
Oct 03 14:04:19 crc kubenswrapper[4636]: W1003 14:04:19.125587 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7f8fb91_fbef_43b5_b771_f376cfbb1cdd.slice/crio-29b94c7e5e2c10a7fccf649b5fec67096633b4c067ccf67904a775e0a0dee8ba WatchSource:0}: Error finding container 29b94c7e5e2c10a7fccf649b5fec67096633b4c067ccf67904a775e0a0dee8ba: Status 404 returned error can't find the container with id 29b94c7e5e2c10a7fccf649b5fec67096633b4c067ccf67904a775e0a0dee8ba
Oct 03 14:04:19 crc kubenswrapper[4636]: I1003 14:04:19.483072 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" event={"ID":"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd","Type":"ContainerStarted","Data":"29b94c7e5e2c10a7fccf649b5fec67096633b4c067ccf67904a775e0a0dee8ba"}
Oct 03 14:04:19 crc kubenswrapper[4636]: I1003 14:04:19.489542 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5q7j6" event={"ID":"40e1f28c-6d64-4fa1-b554-507ff389f115","Type":"ContainerStarted","Data":"9fe34d6b8091b15fa2fcd7b170a68e414ef609d94b81613f6ab92918c00d6540"}
Oct 03 14:04:19 crc kubenswrapper[4636]: I1003 14:04:19.489611 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-5q7j6"
Oct 03 14:04:19 crc kubenswrapper[4636]: E1003 14:04:19.491419 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-bp525" podUID="9a9d47ea-2f67-4930-b870-cb6a68815b0f"
Oct 03 14:04:19 crc kubenswrapper[4636]: I1003 14:04:19.491491 4636 patch_prober.go:28] interesting pod/downloads-7954f5f757-5q7j6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Oct 03 14:04:19 crc kubenswrapper[4636]: I1003 14:04:19.491541 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5q7j6" podUID="40e1f28c-6d64-4fa1-b554-507ff389f115" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Oct 03 14:04:19 crc kubenswrapper[4636]: E1003 14:04:19.491855 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-82bmp" podUID="6040820f-38e4-4416-8648-32025aee8fcb"
Oct 03 14:04:19 crc kubenswrapper[4636]: E1003 14:04:19.587582 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Oct 03 14:04:19 crc kubenswrapper[4636]: E1003 14:04:19.587717 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4cvmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-nswpx_openshift-marketplace(727c69c5-0eaa-4dba-b5a8-131486e3636e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 03 14:04:19 crc kubenswrapper[4636]: E1003 14:04:19.588958 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-nswpx" podUID="727c69c5-0eaa-4dba-b5a8-131486e3636e"
Oct 03 14:04:20 crc kubenswrapper[4636]: E1003 14:04:20.349701 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Oct 03 14:04:20 crc kubenswrapper[4636]: E1003 14:04:20.349838 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9ppfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-p7txd_openshift-marketplace(29d1406c-66ee-4666-9627-e62af43b4f3d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 03 14:04:20 crc kubenswrapper[4636]: E1003 14:04:20.350998 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-p7txd" podUID="29d1406c-66ee-4666-9627-e62af43b4f3d"
Oct 03 14:04:20 crc kubenswrapper[4636]: I1003 14:04:20.493064 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" event={"ID":"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd","Type":"ContainerStarted","Data":"43e01cf1239bbcedbc84c684862290d73426af3cf1397ba93088bafc513bd6cb"}
Oct 03 14:04:20 crc kubenswrapper[4636]: E1003 14:04:20.495012 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-nswpx" podUID="727c69c5-0eaa-4dba-b5a8-131486e3636e"
Oct 03 14:04:20 crc kubenswrapper[4636]: E1003 14:04:20.495045 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-p7txd" podUID="29d1406c-66ee-4666-9627-e62af43b4f3d"
Oct 03 14:04:20 crc kubenswrapper[4636]: I1003 14:04:20.496569 4636 patch_prober.go:28] interesting pod/downloads-7954f5f757-5q7j6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Oct 03 14:04:20 crc kubenswrapper[4636]: I1003 14:04:20.496623 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5q7j6" podUID="40e1f28c-6d64-4fa1-b554-507ff389f115" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Oct 03 14:04:21 crc kubenswrapper[4636]: I1003 14:04:21.498411 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vm9z7" event={"ID":"a7f8fb91-fbef-43b5-b771-f376cfbb1cdd","Type":"ContainerStarted","Data":"e2d16b7c341ff80bb2b2b6f71d6ecbc3673ded982b15093f6d190ca0cdd9e3ea"}
Oct 03 14:04:22 crc kubenswrapper[4636]: I1003 14:04:22.522461 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vm9z7" podStartSLOduration=190.522445295 podStartE2EDuration="3m10.522445295s" podCreationTimestamp="2025-10-03 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:04:22.520486894 +0000 UTC m=+212.379213141" watchObservedRunningTime="2025-10-03 14:04:22.522445295 +0000 UTC m=+212.381171532"
Oct 03 14:04:22 crc kubenswrapper[4636]: I1003 14:04:22.997726 4636 patch_prober.go:28] interesting pod/downloads-7954f5f757-5q7j6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Oct 03 14:04:22 crc kubenswrapper[4636]: I1003 14:04:22.997949 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5q7j6" podUID="40e1f28c-6d64-4fa1-b554-507ff389f115" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Oct 03 14:04:22 crc kubenswrapper[4636]: I1003 14:04:22.997726 4636 patch_prober.go:28] interesting pod/downloads-7954f5f757-5q7j6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Oct 03 14:04:22 crc kubenswrapper[4636]: I1003 14:04:22.998020 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5q7j6" podUID="40e1f28c-6d64-4fa1-b554-507ff389f115" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Oct 03 14:04:33 crc kubenswrapper[4636]: I1003 14:04:33.002662 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-5q7j6"
Oct 03 14:04:36 crc kubenswrapper[4636]: I1003 14:04:36.591411 4636 generic.go:334] "Generic (PLEG): container finished" podID="3c76d8db-b385-49b1-b8cf-f7286f3e49c2" containerID="fedfe74661c83271b096815364d09726924ec85ee2239e0baf5c54be467505bb" exitCode=0
Oct 03 14:04:36 crc kubenswrapper[4636]: I1003 14:04:36.591479 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7q4hj" event={"ID":"3c76d8db-b385-49b1-b8cf-f7286f3e49c2","Type":"ContainerDied","Data":"fedfe74661c83271b096815364d09726924ec85ee2239e0baf5c54be467505bb"}
Oct 03 14:04:36 crc kubenswrapper[4636]: I1003 14:04:36.595438 4636 generic.go:334] "Generic (PLEG): container finished" podID="9a9d47ea-2f67-4930-b870-cb6a68815b0f" containerID="c6e3d64fa9693487e1f16da2a76c539800163ef8b1f8672c6e40b5d1e5f010ac" exitCode=0
Oct 03 14:04:36 crc kubenswrapper[4636]: I1003 14:04:36.595462 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp525" event={"ID":"9a9d47ea-2f67-4930-b870-cb6a68815b0f","Type":"ContainerDied","Data":"c6e3d64fa9693487e1f16da2a76c539800163ef8b1f8672c6e40b5d1e5f010ac"}
Oct 03 14:04:36 crc kubenswrapper[4636]: I1003 14:04:36.597947 4636 generic.go:334] "Generic (PLEG): container finished" podID="6040820f-38e4-4416-8648-32025aee8fcb" containerID="85644ebf2f5e1d9d123df80960756d3d3cded380c1baa037a672094e20b782dd" exitCode=0
Oct 03 14:04:36 crc kubenswrapper[4636]: I1003 14:04:36.598005 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82bmp" event={"ID":"6040820f-38e4-4416-8648-32025aee8fcb","Type":"ContainerDied","Data":"85644ebf2f5e1d9d123df80960756d3d3cded380c1baa037a672094e20b782dd"}
Oct 03 14:04:36 crc kubenswrapper[4636]: I1003 14:04:36.600389 4636 generic.go:334] "Generic (PLEG): container finished" podID="a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34" containerID="e766a126e48e1e4d6914ac651a22788b00a8bbbd75432e4674650dc5751ea177" exitCode=0
Oct 03 14:04:36 crc kubenswrapper[4636]: I1003 14:04:36.600495 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzzbm" event={"ID":"a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34","Type":"ContainerDied","Data":"e766a126e48e1e4d6914ac651a22788b00a8bbbd75432e4674650dc5751ea177"}
Oct 03 14:04:36 crc kubenswrapper[4636]: I1003 14:04:36.603185 4636 generic.go:334] "Generic (PLEG): container finished" podID="d9d200ad-14e6-46da-bc1c-a600294e8600" containerID="9746599b326bf61af86b5368377f20a854c8b256877cbf9a6118ba22cd78c070" exitCode=0
Oct 03 14:04:36 crc kubenswrapper[4636]: I1003 14:04:36.603283 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlscj" event={"ID":"d9d200ad-14e6-46da-bc1c-a600294e8600","Type":"ContainerDied","Data":"9746599b326bf61af86b5368377f20a854c8b256877cbf9a6118ba22cd78c070"}
Oct 03 14:04:36 crc kubenswrapper[4636]: I1003 14:04:36.605321 4636 generic.go:334] "Generic (PLEG): container finished" podID="29d1406c-66ee-4666-9627-e62af43b4f3d" containerID="e9550686548a68f686a24176919b69e1e5e977105d0ccc0749edb21e4c30460d" exitCode=0
Oct 03 14:04:36 crc kubenswrapper[4636]: I1003 14:04:36.605350 4636
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7txd" event={"ID":"29d1406c-66ee-4666-9627-e62af43b4f3d","Type":"ContainerDied","Data":"e9550686548a68f686a24176919b69e1e5e977105d0ccc0749edb21e4c30460d"} Oct 03 14:04:37 crc kubenswrapper[4636]: I1003 14:04:37.614608 4636 generic.go:334] "Generic (PLEG): container finished" podID="727c69c5-0eaa-4dba-b5a8-131486e3636e" containerID="d1618039f1da37bef855f651d1b78119b06f78365af3d1a576d5c20e8bd261dc" exitCode=0 Oct 03 14:04:37 crc kubenswrapper[4636]: I1003 14:04:37.614659 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nswpx" event={"ID":"727c69c5-0eaa-4dba-b5a8-131486e3636e","Type":"ContainerDied","Data":"d1618039f1da37bef855f651d1b78119b06f78365af3d1a576d5c20e8bd261dc"} Oct 03 14:04:39 crc kubenswrapper[4636]: I1003 14:04:39.163066 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:04:39 crc kubenswrapper[4636]: I1003 14:04:39.163162 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:04:39 crc kubenswrapper[4636]: I1003 14:04:39.163223 4636 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 14:04:39 crc kubenswrapper[4636]: I1003 14:04:39.163834 4636 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f"} pod="openshift-machine-config-operator/machine-config-daemon-ngmch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 14:04:39 crc kubenswrapper[4636]: I1003 14:04:39.163890 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" containerID="cri-o://8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f" gracePeriod=600 Oct 03 14:04:40 crc kubenswrapper[4636]: I1003 14:04:40.633191 4636 generic.go:334] "Generic (PLEG): container finished" podID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerID="8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f" exitCode=0 Oct 03 14:04:40 crc kubenswrapper[4636]: I1003 14:04:40.633355 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerDied","Data":"8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f"} Oct 03 14:04:43 crc kubenswrapper[4636]: I1003 14:04:43.648240 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerStarted","Data":"5c0c4ea124622f317166a6ea5cb84988a6632a919c472664da27e020c7262591"} Oct 03 14:04:43 
crc kubenswrapper[4636]: I1003 14:04:43.651182 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zv2qw" event={"ID":"8cc66cc3-d679-469f-9b34-021345e9007f","Type":"ContainerStarted","Data":"c42ed1daf9e0e1edb0fce58be39b314e9c4812a34d6656b31d2624ad150a51a5"} Oct 03 14:04:43 crc kubenswrapper[4636]: I1003 14:04:43.653008 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlscj" event={"ID":"d9d200ad-14e6-46da-bc1c-a600294e8600","Type":"ContainerStarted","Data":"32a4f471cad14e1751ad24d90d23dfaab1f676fdf9bcd15b9728d0b6999b4f70"} Oct 03 14:04:44 crc kubenswrapper[4636]: I1003 14:04:44.659574 4636 generic.go:334] "Generic (PLEG): container finished" podID="8cc66cc3-d679-469f-9b34-021345e9007f" containerID="c42ed1daf9e0e1edb0fce58be39b314e9c4812a34d6656b31d2624ad150a51a5" exitCode=0 Oct 03 14:04:44 crc kubenswrapper[4636]: I1003 14:04:44.659654 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zv2qw" event={"ID":"8cc66cc3-d679-469f-9b34-021345e9007f","Type":"ContainerDied","Data":"c42ed1daf9e0e1edb0fce58be39b314e9c4812a34d6656b31d2624ad150a51a5"} Oct 03 14:04:44 crc kubenswrapper[4636]: I1003 14:04:44.697460 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wlscj" podStartSLOduration=5.346685364 podStartE2EDuration="1m20.697388899s" podCreationTimestamp="2025-10-03 14:03:24 +0000 UTC" firstStartedPulling="2025-10-03 14:03:26.928530245 +0000 UTC m=+156.787256492" lastFinishedPulling="2025-10-03 14:04:42.27923378 +0000 UTC m=+232.137960027" observedRunningTime="2025-10-03 14:04:44.693423607 +0000 UTC m=+234.552149854" watchObservedRunningTime="2025-10-03 14:04:44.697388899 +0000 UTC m=+234.556115146" Oct 03 14:04:44 crc kubenswrapper[4636]: I1003 14:04:44.831640 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wlscj" Oct 03 14:04:44 crc kubenswrapper[4636]: I1003 14:04:44.831690 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wlscj" Oct 03 14:04:45 crc kubenswrapper[4636]: I1003 14:04:45.668290 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nswpx" event={"ID":"727c69c5-0eaa-4dba-b5a8-131486e3636e","Type":"ContainerStarted","Data":"ddc3865907961ed5eedde01e42691566be06720ac2ebaf3a3c07f916af94861a"} Oct 03 14:04:45 crc kubenswrapper[4636]: I1003 14:04:45.687505 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nswpx" podStartSLOduration=5.218818728 podStartE2EDuration="1m23.687489329s" podCreationTimestamp="2025-10-03 14:03:22 +0000 UTC" firstStartedPulling="2025-10-03 14:03:25.899308313 +0000 UTC m=+155.758034560" lastFinishedPulling="2025-10-03 14:04:44.367978914 +0000 UTC m=+234.226705161" observedRunningTime="2025-10-03 14:04:45.684724268 +0000 UTC m=+235.543450515" watchObservedRunningTime="2025-10-03 14:04:45.687489329 +0000 UTC m=+235.546215576" Oct 03 14:04:46 crc kubenswrapper[4636]: I1003 14:04:46.173698 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wlscj" podUID="d9d200ad-14e6-46da-bc1c-a600294e8600" containerName="registry-server" probeResult="failure" output=< Oct 03 14:04:46 crc kubenswrapper[4636]: timeout: failed to connect service ":50051" within 1s Oct 03 14:04:46 crc 
kubenswrapper[4636]: > Oct 03 14:04:46 crc kubenswrapper[4636]: I1003 14:04:46.675810 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7q4hj" event={"ID":"3c76d8db-b385-49b1-b8cf-f7286f3e49c2","Type":"ContainerStarted","Data":"2bc71288f48749138f4364dd5760c4be53fa45b7eb21a16a8d7e5d59204251fa"} Oct 03 14:04:46 crc kubenswrapper[4636]: I1003 14:04:46.679049 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82bmp" event={"ID":"6040820f-38e4-4416-8648-32025aee8fcb","Type":"ContainerStarted","Data":"b2308df50a1e1371b46703c434942a85c8133dec183aba8035792ae55a49dc54"} Oct 03 14:04:46 crc kubenswrapper[4636]: I1003 14:04:46.681478 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp525" event={"ID":"9a9d47ea-2f67-4930-b870-cb6a68815b0f","Type":"ContainerStarted","Data":"2c26c8114dcd1b2fb5a4b07e39bc407b35bc47992f8fe918a333fd9f1639750e"} Oct 03 14:04:46 crc kubenswrapper[4636]: I1003 14:04:46.683541 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzzbm" event={"ID":"a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34","Type":"ContainerStarted","Data":"3b9612b0f520f11a3bbaff7bac5b4b6355fef049cfe5c551021f305ef0d42ac7"} Oct 03 14:04:46 crc kubenswrapper[4636]: I1003 14:04:46.685832 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zv2qw" event={"ID":"8cc66cc3-d679-469f-9b34-021345e9007f","Type":"ContainerStarted","Data":"918207e92eb987cc1f203c45e5952c5b949098baed7a097e80af88ceeaa1cb51"} Oct 03 14:04:46 crc kubenswrapper[4636]: I1003 14:04:46.688058 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7txd" event={"ID":"29d1406c-66ee-4666-9627-e62af43b4f3d","Type":"ContainerStarted","Data":"3b6c6f4b4b0a9020f5c8b5143e4223f2f550d043741c6dd089a09c4bbbe416eb"} Oct 03 14:04:46 crc kubenswrapper[4636]: I1003 14:04:46.701233 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7q4hj" podStartSLOduration=5.489279673 podStartE2EDuration="1m26.701215128s" podCreationTimestamp="2025-10-03 14:03:20 +0000 UTC" firstStartedPulling="2025-10-03 14:03:24.688754819 +0000 UTC m=+154.547481066" lastFinishedPulling="2025-10-03 14:04:45.900690274 +0000 UTC m=+235.759416521" observedRunningTime="2025-10-03 14:04:46.697282077 +0000 UTC m=+236.556008334" watchObservedRunningTime="2025-10-03 14:04:46.701215128 +0000 UTC m=+236.559941375" Oct 03 14:04:46 crc kubenswrapper[4636]: I1003 14:04:46.726894 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-82bmp" podStartSLOduration=4.5489895449999995 podStartE2EDuration="1m25.726875188s" podCreationTimestamp="2025-10-03 14:03:21 +0000 UTC" firstStartedPulling="2025-10-03 14:03:24.700634961 +0000 UTC m=+154.559361208" lastFinishedPulling="2025-10-03 14:04:45.878520594 +0000 UTC m=+235.737246851" observedRunningTime="2025-10-03 14:04:46.723255585 +0000 UTC m=+236.581981842" watchObservedRunningTime="2025-10-03 14:04:46.726875188 +0000 UTC m=+236.585601435" Oct 03 14:04:46 crc kubenswrapper[4636]: I1003 14:04:46.747051 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tzzbm" podStartSLOduration=5.699282362 podStartE2EDuration="1m26.747034557s" podCreationTimestamp="2025-10-03 14:03:20 +0000 UTC" 
firstStartedPulling="2025-10-03 14:03:24.746196703 +0000 UTC m=+154.604922950" lastFinishedPulling="2025-10-03 14:04:45.793948898 +0000 UTC m=+235.652675145" observedRunningTime="2025-10-03 14:04:46.7463859 +0000 UTC m=+236.605112147" watchObservedRunningTime="2025-10-03 14:04:46.747034557 +0000 UTC m=+236.605760804" Oct 03 14:04:46 crc kubenswrapper[4636]: I1003 14:04:46.772531 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p7txd" podStartSLOduration=6.44464225 podStartE2EDuration="1m26.772511952s" podCreationTimestamp="2025-10-03 14:03:20 +0000 UTC" firstStartedPulling="2025-10-03 14:03:24.683218067 +0000 UTC m=+154.541944314" lastFinishedPulling="2025-10-03 14:04:45.011087769 +0000 UTC m=+234.869814016" observedRunningTime="2025-10-03 14:04:46.771003984 +0000 UTC m=+236.629730231" watchObservedRunningTime="2025-10-03 14:04:46.772511952 +0000 UTC m=+236.631238189" Oct 03 14:04:46 crc kubenswrapper[4636]: I1003 14:04:46.804952 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zv2qw" podStartSLOduration=2.567327139 podStartE2EDuration="1m22.804935757s" podCreationTimestamp="2025-10-03 14:03:24 +0000 UTC" firstStartedPulling="2025-10-03 14:03:25.817329114 +0000 UTC m=+155.676055361" lastFinishedPulling="2025-10-03 14:04:46.054937732 +0000 UTC m=+235.913663979" observedRunningTime="2025-10-03 14:04:46.800448081 +0000 UTC m=+236.659174328" watchObservedRunningTime="2025-10-03 14:04:46.804935757 +0000 UTC m=+236.663662004" Oct 03 14:04:46 crc kubenswrapper[4636]: I1003 14:04:46.827579 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bp525" podStartSLOduration=5.466503353 podStartE2EDuration="1m24.827560949s" podCreationTimestamp="2025-10-03 14:03:22 +0000 UTC" firstStartedPulling="2025-10-03 14:03:25.80817098 +0000 UTC m=+155.666897227" lastFinishedPulling="2025-10-03 14:04:45.169228576 +0000 UTC m=+235.027954823" observedRunningTime="2025-10-03 14:04:46.825562297 +0000 UTC m=+236.684288544" watchObservedRunningTime="2025-10-03 14:04:46.827560949 +0000 UTC m=+236.686287196" Oct 03 14:04:50 crc kubenswrapper[4636]: I1003 14:04:50.824150 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tzzbm" Oct 03 14:04:50 crc kubenswrapper[4636]: I1003 14:04:50.824712 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tzzbm" Oct 03 14:04:50 crc kubenswrapper[4636]: I1003 14:04:50.991964 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tzzbm" Oct 03 14:04:51 crc kubenswrapper[4636]: I1003 14:04:51.265213 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7q4hj" Oct 03 14:04:51 crc kubenswrapper[4636]: I1003 14:04:51.265492 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7q4hj" Oct 03 14:04:51 crc kubenswrapper[4636]: I1003 14:04:51.310328 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7q4hj" Oct 03 14:04:51 crc kubenswrapper[4636]: I1003 14:04:51.326944 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p7txd" Oct 03 
14:04:51 crc kubenswrapper[4636]: I1003 14:04:51.326980 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p7txd" Oct 03 14:04:51 crc kubenswrapper[4636]: I1003 14:04:51.366450 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p7txd" Oct 03 14:04:51 crc kubenswrapper[4636]: I1003 14:04:51.525323 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-82bmp" Oct 03 14:04:51 crc kubenswrapper[4636]: I1003 14:04:51.525373 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-82bmp" Oct 03 14:04:51 crc kubenswrapper[4636]: I1003 14:04:51.565509 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-82bmp" Oct 03 14:04:51 crc kubenswrapper[4636]: I1003 14:04:51.750923 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7q4hj" Oct 03 14:04:51 crc kubenswrapper[4636]: I1003 14:04:51.759119 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tzzbm" Oct 03 14:04:51 crc kubenswrapper[4636]: I1003 14:04:51.759180 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-82bmp" Oct 03 14:04:51 crc kubenswrapper[4636]: I1003 14:04:51.759571 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p7txd" Oct 03 14:04:52 crc kubenswrapper[4636]: I1003 14:04:52.289168 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p7txd"] Oct 03 14:04:53 crc kubenswrapper[4636]: I1003 14:04:53.176124 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nswpx" Oct 03 14:04:53 crc kubenswrapper[4636]: I1003 14:04:53.176484 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nswpx" Oct 03 14:04:53 crc kubenswrapper[4636]: I1003 14:04:53.219516 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nswpx" Oct 03 14:04:53 crc kubenswrapper[4636]: I1003 14:04:53.266282 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bp525" Oct 03 14:04:53 crc kubenswrapper[4636]: I1003 14:04:53.266338 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bp525" Oct 03 14:04:53 crc kubenswrapper[4636]: I1003 14:04:53.300192 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bp525" Oct 03 14:04:53 crc kubenswrapper[4636]: I1003 14:04:53.689581 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-82bmp"] Oct 03 14:04:53 crc kubenswrapper[4636]: I1003 14:04:53.725733 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-82bmp" podUID="6040820f-38e4-4416-8648-32025aee8fcb" containerName="registry-server" containerID="cri-o://b2308df50a1e1371b46703c434942a85c8133dec183aba8035792ae55a49dc54" gracePeriod=2 
Oct 03 14:04:53 crc kubenswrapper[4636]: I1003 14:04:53.727179 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p7txd" podUID="29d1406c-66ee-4666-9627-e62af43b4f3d" containerName="registry-server" containerID="cri-o://3b6c6f4b4b0a9020f5c8b5143e4223f2f550d043741c6dd089a09c4bbbe416eb" gracePeriod=2
Oct 03 14:04:53 crc kubenswrapper[4636]: I1003 14:04:53.766194 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bp525"
Oct 03 14:04:53 crc kubenswrapper[4636]: I1003 14:04:53.768767 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nswpx"
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.531799 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zv2qw"
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.531849 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zv2qw"
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.576512 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zv2qw"
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.731669 4636 generic.go:334] "Generic (PLEG): container finished" podID="29d1406c-66ee-4666-9627-e62af43b4f3d" containerID="3b6c6f4b4b0a9020f5c8b5143e4223f2f550d043741c6dd089a09c4bbbe416eb" exitCode=0
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.731878 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7txd" event={"ID":"29d1406c-66ee-4666-9627-e62af43b4f3d","Type":"ContainerDied","Data":"3b6c6f4b4b0a9020f5c8b5143e4223f2f550d043741c6dd089a09c4bbbe416eb"}
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.732040 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7txd" event={"ID":"29d1406c-66ee-4666-9627-e62af43b4f3d","Type":"ContainerDied","Data":"42fde5472f6366027d07d257c7142e66a6427f94635bf083d2977334020f6df9"}
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.732060 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42fde5472f6366027d07d257c7142e66a6427f94635bf083d2977334020f6df9"
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.743153 4636 generic.go:334] "Generic (PLEG): container finished" podID="6040820f-38e4-4416-8648-32025aee8fcb" containerID="b2308df50a1e1371b46703c434942a85c8133dec183aba8035792ae55a49dc54" exitCode=0
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.743372 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82bmp" event={"ID":"6040820f-38e4-4416-8648-32025aee8fcb","Type":"ContainerDied","Data":"b2308df50a1e1371b46703c434942a85c8133dec183aba8035792ae55a49dc54"}
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.743409 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82bmp" event={"ID":"6040820f-38e4-4416-8648-32025aee8fcb","Type":"ContainerDied","Data":"a4a47b065f9963273aeccab5feefd92aff40e1efcb58123e818cfa1d43c7ae9f"}
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.743425 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4a47b065f9963273aeccab5feefd92aff40e1efcb58123e818cfa1d43c7ae9f"
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.756395 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-82bmp"
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.766519 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p7txd"
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.797086 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zv2qw"
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.865678 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ppfb\" (UniqueName: \"kubernetes.io/projected/29d1406c-66ee-4666-9627-e62af43b4f3d-kube-api-access-9ppfb\") pod \"29d1406c-66ee-4666-9627-e62af43b4f3d\" (UID: \"29d1406c-66ee-4666-9627-e62af43b4f3d\") "
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.865761 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29d1406c-66ee-4666-9627-e62af43b4f3d-utilities\") pod \"29d1406c-66ee-4666-9627-e62af43b4f3d\" (UID: \"29d1406c-66ee-4666-9627-e62af43b4f3d\") "
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.865815 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29d1406c-66ee-4666-9627-e62af43b4f3d-catalog-content\") pod \"29d1406c-66ee-4666-9627-e62af43b4f3d\" (UID: \"29d1406c-66ee-4666-9627-e62af43b4f3d\") "
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.865850 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6040820f-38e4-4416-8648-32025aee8fcb-catalog-content\") pod \"6040820f-38e4-4416-8648-32025aee8fcb\" (UID: \"6040820f-38e4-4416-8648-32025aee8fcb\") "
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.865887 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6040820f-38e4-4416-8648-32025aee8fcb-utilities\") pod \"6040820f-38e4-4416-8648-32025aee8fcb\" (UID: \"6040820f-38e4-4416-8648-32025aee8fcb\") "
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.865922 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2gsw\" (UniqueName: \"kubernetes.io/projected/6040820f-38e4-4416-8648-32025aee8fcb-kube-api-access-f2gsw\") pod \"6040820f-38e4-4416-8648-32025aee8fcb\" (UID: \"6040820f-38e4-4416-8648-32025aee8fcb\") "
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.867798 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29d1406c-66ee-4666-9627-e62af43b4f3d-utilities" (OuterVolumeSpecName: "utilities") pod "29d1406c-66ee-4666-9627-e62af43b4f3d" (UID: "29d1406c-66ee-4666-9627-e62af43b4f3d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.872348 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6040820f-38e4-4416-8648-32025aee8fcb-kube-api-access-f2gsw" (OuterVolumeSpecName: "kube-api-access-f2gsw") pod "6040820f-38e4-4416-8648-32025aee8fcb" (UID: "6040820f-38e4-4416-8648-32025aee8fcb"). InnerVolumeSpecName "kube-api-access-f2gsw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.875542 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29d1406c-66ee-4666-9627-e62af43b4f3d-kube-api-access-9ppfb" (OuterVolumeSpecName: "kube-api-access-9ppfb") pod "29d1406c-66ee-4666-9627-e62af43b4f3d" (UID: "29d1406c-66ee-4666-9627-e62af43b4f3d"). InnerVolumeSpecName "kube-api-access-9ppfb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.875704 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wlscj"
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.885414 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6040820f-38e4-4416-8648-32025aee8fcb-utilities" (OuterVolumeSpecName: "utilities") pod "6040820f-38e4-4416-8648-32025aee8fcb" (UID: "6040820f-38e4-4416-8648-32025aee8fcb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.919310 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wlscj"
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.933297 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29d1406c-66ee-4666-9627-e62af43b4f3d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29d1406c-66ee-4666-9627-e62af43b4f3d" (UID: "29d1406c-66ee-4666-9627-e62af43b4f3d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.944616 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6040820f-38e4-4416-8648-32025aee8fcb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6040820f-38e4-4416-8648-32025aee8fcb" (UID: "6040820f-38e4-4416-8648-32025aee8fcb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.968128 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29d1406c-66ee-4666-9627-e62af43b4f3d-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.968163 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6040820f-38e4-4416-8648-32025aee8fcb-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.968173 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6040820f-38e4-4416-8648-32025aee8fcb-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.968228 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2gsw\" (UniqueName: \"kubernetes.io/projected/6040820f-38e4-4416-8648-32025aee8fcb-kube-api-access-f2gsw\") on node \"crc\" DevicePath \"\""
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.968240 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ppfb\" (UniqueName: \"kubernetes.io/projected/29d1406c-66ee-4666-9627-e62af43b4f3d-kube-api-access-9ppfb\") on node \"crc\" DevicePath \"\""
Oct 03 14:04:54 crc kubenswrapper[4636]: I1003 14:04:54.968248 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29d1406c-66ee-4666-9627-e62af43b4f3d-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 14:04:55 crc kubenswrapper[4636]: I1003 14:04:55.747222 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-82bmp"
Oct 03 14:04:55 crc kubenswrapper[4636]: I1003 14:04:55.747286 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p7txd"
Oct 03 14:04:55 crc kubenswrapper[4636]: I1003 14:04:55.775203 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p7txd"]
Oct 03 14:04:55 crc kubenswrapper[4636]: I1003 14:04:55.775256 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p7txd"]
Oct 03 14:04:55 crc kubenswrapper[4636]: I1003 14:04:55.789226 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-82bmp"]
Oct 03 14:04:55 crc kubenswrapper[4636]: I1003 14:04:55.792082 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-82bmp"]
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.088342 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bp525"]
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.088591 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bp525" podUID="9a9d47ea-2f67-4930-b870-cb6a68815b0f" containerName="registry-server" containerID="cri-o://2c26c8114dcd1b2fb5a4b07e39bc407b35bc47992f8fe918a333fd9f1639750e" gracePeriod=2
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.455952 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bp525"
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.588707 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a9d47ea-2f67-4930-b870-cb6a68815b0f-catalog-content\") pod \"9a9d47ea-2f67-4930-b870-cb6a68815b0f\" (UID: \"9a9d47ea-2f67-4930-b870-cb6a68815b0f\") "
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.588758 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgkl7\" (UniqueName: \"kubernetes.io/projected/9a9d47ea-2f67-4930-b870-cb6a68815b0f-kube-api-access-vgkl7\") pod \"9a9d47ea-2f67-4930-b870-cb6a68815b0f\" (UID: \"9a9d47ea-2f67-4930-b870-cb6a68815b0f\") "
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.588844 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a9d47ea-2f67-4930-b870-cb6a68815b0f-utilities\") pod \"9a9d47ea-2f67-4930-b870-cb6a68815b0f\" (UID: \"9a9d47ea-2f67-4930-b870-cb6a68815b0f\") "
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.589585 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a9d47ea-2f67-4930-b870-cb6a68815b0f-utilities" (OuterVolumeSpecName: "utilities") pod "9a9d47ea-2f67-4930-b870-cb6a68815b0f" (UID: "9a9d47ea-2f67-4930-b870-cb6a68815b0f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.601688 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a9d47ea-2f67-4930-b870-cb6a68815b0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a9d47ea-2f67-4930-b870-cb6a68815b0f" (UID: "9a9d47ea-2f67-4930-b870-cb6a68815b0f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.604369 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a9d47ea-2f67-4930-b870-cb6a68815b0f-kube-api-access-vgkl7" (OuterVolumeSpecName: "kube-api-access-vgkl7") pod "9a9d47ea-2f67-4930-b870-cb6a68815b0f" (UID: "9a9d47ea-2f67-4930-b870-cb6a68815b0f"). InnerVolumeSpecName "kube-api-access-vgkl7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.691671 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a9d47ea-2f67-4930-b870-cb6a68815b0f-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.691709 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a9d47ea-2f67-4930-b870-cb6a68815b0f-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.691724 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgkl7\" (UniqueName: \"kubernetes.io/projected/9a9d47ea-2f67-4930-b870-cb6a68815b0f-kube-api-access-vgkl7\") on node \"crc\" DevicePath \"\""
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.753019 4636 generic.go:334] "Generic (PLEG): container finished" podID="9a9d47ea-2f67-4930-b870-cb6a68815b0f" containerID="2c26c8114dcd1b2fb5a4b07e39bc407b35bc47992f8fe918a333fd9f1639750e" exitCode=0
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.753061 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bp525"
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.753077 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp525" event={"ID":"9a9d47ea-2f67-4930-b870-cb6a68815b0f","Type":"ContainerDied","Data":"2c26c8114dcd1b2fb5a4b07e39bc407b35bc47992f8fe918a333fd9f1639750e"}
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.753155 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp525" event={"ID":"9a9d47ea-2f67-4930-b870-cb6a68815b0f","Type":"ContainerDied","Data":"679832bac42d7c26bf4ebdc2a087546a858007d742bb12d67a58e1abe07a51f7"}
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.753193 4636 scope.go:117] "RemoveContainer" containerID="2c26c8114dcd1b2fb5a4b07e39bc407b35bc47992f8fe918a333fd9f1639750e"
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.773334 4636 scope.go:117] "RemoveContainer" containerID="c6e3d64fa9693487e1f16da2a76c539800163ef8b1f8672c6e40b5d1e5f010ac"
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.785841 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bp525"]
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.788688 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bp525"]
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.794735 4636 scope.go:117] "RemoveContainer" containerID="5043dc50950d669fa4c7c93a4109a8c0496681504103306556ebc2d5dc9f6451"
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.803829 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29d1406c-66ee-4666-9627-e62af43b4f3d" path="/var/lib/kubelet/pods/29d1406c-66ee-4666-9627-e62af43b4f3d/volumes"
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.804976 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6040820f-38e4-4416-8648-32025aee8fcb" path="/var/lib/kubelet/pods/6040820f-38e4-4416-8648-32025aee8fcb/volumes"
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.805771 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a9d47ea-2f67-4930-b870-cb6a68815b0f" path="/var/lib/kubelet/pods/9a9d47ea-2f67-4930-b870-cb6a68815b0f/volumes"
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.811898 4636 scope.go:117] "RemoveContainer" containerID="2c26c8114dcd1b2fb5a4b07e39bc407b35bc47992f8fe918a333fd9f1639750e"
Oct 03 14:04:56 crc kubenswrapper[4636]: E1003 14:04:56.812357 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c26c8114dcd1b2fb5a4b07e39bc407b35bc47992f8fe918a333fd9f1639750e\": container with ID starting with 2c26c8114dcd1b2fb5a4b07e39bc407b35bc47992f8fe918a333fd9f1639750e not found: ID does not exist" containerID="2c26c8114dcd1b2fb5a4b07e39bc407b35bc47992f8fe918a333fd9f1639750e"
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.812407 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c26c8114dcd1b2fb5a4b07e39bc407b35bc47992f8fe918a333fd9f1639750e"} err="failed to get container status \"2c26c8114dcd1b2fb5a4b07e39bc407b35bc47992f8fe918a333fd9f1639750e\": rpc error: code = NotFound desc = could not find container \"2c26c8114dcd1b2fb5a4b07e39bc407b35bc47992f8fe918a333fd9f1639750e\": container with ID starting with 2c26c8114dcd1b2fb5a4b07e39bc407b35bc47992f8fe918a333fd9f1639750e not found: ID does not exist"
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.812434 4636 scope.go:117] "RemoveContainer" containerID="c6e3d64fa9693487e1f16da2a76c539800163ef8b1f8672c6e40b5d1e5f010ac"
Oct 03 14:04:56 crc kubenswrapper[4636]: E1003 14:04:56.812817 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6e3d64fa9693487e1f16da2a76c539800163ef8b1f8672c6e40b5d1e5f010ac\": container with ID starting with c6e3d64fa9693487e1f16da2a76c539800163ef8b1f8672c6e40b5d1e5f010ac not found: ID does not exist" containerID="c6e3d64fa9693487e1f16da2a76c539800163ef8b1f8672c6e40b5d1e5f010ac"
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.812854 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6e3d64fa9693487e1f16da2a76c539800163ef8b1f8672c6e40b5d1e5f010ac"} err="failed to get container status \"c6e3d64fa9693487e1f16da2a76c539800163ef8b1f8672c6e40b5d1e5f010ac\": rpc error: code = NotFound desc = could not find container \"c6e3d64fa9693487e1f16da2a76c539800163ef8b1f8672c6e40b5d1e5f010ac\": container with ID starting with c6e3d64fa9693487e1f16da2a76c539800163ef8b1f8672c6e40b5d1e5f010ac not found: ID does not exist"
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.812881 4636 scope.go:117] "RemoveContainer" containerID="5043dc50950d669fa4c7c93a4109a8c0496681504103306556ebc2d5dc9f6451"
Oct 03 14:04:56 crc kubenswrapper[4636]: E1003 14:04:56.813234 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5043dc50950d669fa4c7c93a4109a8c0496681504103306556ebc2d5dc9f6451\": container with ID starting with 5043dc50950d669fa4c7c93a4109a8c0496681504103306556ebc2d5dc9f6451 not found: ID does not exist" containerID="5043dc50950d669fa4c7c93a4109a8c0496681504103306556ebc2d5dc9f6451"
Oct 03 14:04:56 crc kubenswrapper[4636]: I1003 14:04:56.813303 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5043dc50950d669fa4c7c93a4109a8c0496681504103306556ebc2d5dc9f6451"} err="failed to get container status \"5043dc50950d669fa4c7c93a4109a8c0496681504103306556ebc2d5dc9f6451\": rpc error: code = NotFound desc = could not find container \"5043dc50950d669fa4c7c93a4109a8c0496681504103306556ebc2d5dc9f6451\": container with ID starting with 5043dc50950d669fa4c7c93a4109a8c0496681504103306556ebc2d5dc9f6451 not found: ID does not exist"
Oct 03 14:04:58 crc kubenswrapper[4636]: I1003 14:04:58.378861 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-srw4g"]
Oct 03 14:04:58 crc kubenswrapper[4636]: I1003 14:04:58.487273 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wlscj"]
Oct 03 14:04:58 crc kubenswrapper[4636]: I1003 14:04:58.487498 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wlscj" podUID="d9d200ad-14e6-46da-bc1c-a600294e8600" containerName="registry-server" containerID="cri-o://32a4f471cad14e1751ad24d90d23dfaab1f676fdf9bcd15b9728d0b6999b4f70" gracePeriod=2
Oct 03 14:04:59 crc kubenswrapper[4636]: I1003 14:04:59.784648 4636 generic.go:334] "Generic (PLEG): container finished" podID="d9d200ad-14e6-46da-bc1c-a600294e8600" containerID="32a4f471cad14e1751ad24d90d23dfaab1f676fdf9bcd15b9728d0b6999b4f70" exitCode=0
Oct 03 14:04:59 crc kubenswrapper[4636]: I1003 14:04:59.784848 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlscj" event={"ID":"d9d200ad-14e6-46da-bc1c-a600294e8600","Type":"ContainerDied","Data":"32a4f471cad14e1751ad24d90d23dfaab1f676fdf9bcd15b9728d0b6999b4f70"}
Oct 03 14:05:00 crc kubenswrapper[4636]: I1003 14:05:00.053859 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wlscj"
Oct 03 14:05:00 crc kubenswrapper[4636]: I1003 14:05:00.233652 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9d200ad-14e6-46da-bc1c-a600294e8600-catalog-content\") pod \"d9d200ad-14e6-46da-bc1c-a600294e8600\" (UID: \"d9d200ad-14e6-46da-bc1c-a600294e8600\") "
Oct 03 14:05:00 crc kubenswrapper[4636]: I1003 14:05:00.241633 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzx8n\" (UniqueName: \"kubernetes.io/projected/d9d200ad-14e6-46da-bc1c-a600294e8600-kube-api-access-vzx8n\") pod \"d9d200ad-14e6-46da-bc1c-a600294e8600\" (UID: \"d9d200ad-14e6-46da-bc1c-a600294e8600\") "
Oct 03 14:05:00 crc kubenswrapper[4636]: I1003 14:05:00.241730 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9d200ad-14e6-46da-bc1c-a600294e8600-utilities\") pod \"d9d200ad-14e6-46da-bc1c-a600294e8600\" (UID: \"d9d200ad-14e6-46da-bc1c-a600294e8600\") "
Oct 03 14:05:00 crc kubenswrapper[4636]: I1003 14:05:00.242464 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9d200ad-14e6-46da-bc1c-a600294e8600-utilities" (OuterVolumeSpecName: "utilities") pod "d9d200ad-14e6-46da-bc1c-a600294e8600" (UID: "d9d200ad-14e6-46da-bc1c-a600294e8600"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:05:00 crc kubenswrapper[4636]: I1003 14:05:00.247145 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9d200ad-14e6-46da-bc1c-a600294e8600-kube-api-access-vzx8n" (OuterVolumeSpecName: "kube-api-access-vzx8n") pod "d9d200ad-14e6-46da-bc1c-a600294e8600" (UID: "d9d200ad-14e6-46da-bc1c-a600294e8600"). InnerVolumeSpecName "kube-api-access-vzx8n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:05:00 crc kubenswrapper[4636]: I1003 14:05:00.323820 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9d200ad-14e6-46da-bc1c-a600294e8600-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9d200ad-14e6-46da-bc1c-a600294e8600" (UID: "d9d200ad-14e6-46da-bc1c-a600294e8600"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:05:00 crc kubenswrapper[4636]: I1003 14:05:00.343277 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9d200ad-14e6-46da-bc1c-a600294e8600-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 14:05:00 crc kubenswrapper[4636]: I1003 14:05:00.343315 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzx8n\" (UniqueName: \"kubernetes.io/projected/d9d200ad-14e6-46da-bc1c-a600294e8600-kube-api-access-vzx8n\") on node \"crc\" DevicePath \"\""
Oct 03 14:05:00 crc kubenswrapper[4636]: I1003 14:05:00.343329 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9d200ad-14e6-46da-bc1c-a600294e8600-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 14:05:00 crc kubenswrapper[4636]: I1003 14:05:00.791674 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlscj" event={"ID":"d9d200ad-14e6-46da-bc1c-a600294e8600","Type":"ContainerDied","Data":"08138f5b364a95920a177bfa764127c3484c2ba021b5bb725872a11f229853fe"}
Oct 03 14:05:00 crc kubenswrapper[4636]: I1003 14:05:00.791714 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wlscj"
Oct 03 14:05:00 crc kubenswrapper[4636]: I1003 14:05:00.791746 4636 scope.go:117] "RemoveContainer" containerID="32a4f471cad14e1751ad24d90d23dfaab1f676fdf9bcd15b9728d0b6999b4f70"
Oct 03 14:05:00 crc kubenswrapper[4636]: I1003 14:05:00.813355 4636 scope.go:117] "RemoveContainer" containerID="9746599b326bf61af86b5368377f20a854c8b256877cbf9a6118ba22cd78c070"
Oct 03 14:05:00 crc kubenswrapper[4636]: I1003 14:05:00.823585 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wlscj"]
Oct 03 14:05:00 crc kubenswrapper[4636]: I1003 14:05:00.826869 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wlscj"]
Oct 03 14:05:00 crc kubenswrapper[4636]: I1003 14:05:00.832883 4636 scope.go:117] "RemoveContainer" containerID="002be0d0f665cbc1af4178597b1dca06a7885c1ff320ad53c77cf46f394b1db8"
Oct 03 14:05:02 crc kubenswrapper[4636]: I1003 14:05:02.804039 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9d200ad-14e6-46da-bc1c-a600294e8600" path="/var/lib/kubelet/pods/d9d200ad-14e6-46da-bc1c-a600294e8600/volumes"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.403491 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" podUID="010ef4ac-9542-4a76-a005-385439b1045c" containerName="oauth-openshift" containerID="cri-o://43bdcb2f8b7da8f7ea02fda5c514156bd0e582963636d28f5fab7f7593fa075f" gracePeriod=15
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.765325 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-srw4g"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.793921 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-764f9b7cd5-w7l49"]
Oct 03 14:05:23 crc kubenswrapper[4636]: E1003 14:05:23.794227 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29d1406c-66ee-4666-9627-e62af43b4f3d" containerName="extract-utilities"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794254 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d1406c-66ee-4666-9627-e62af43b4f3d" containerName="extract-utilities"
Oct 03 14:05:23 crc kubenswrapper[4636]: E1003 14:05:23.794265 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29d1406c-66ee-4666-9627-e62af43b4f3d" containerName="extract-content"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794272 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d1406c-66ee-4666-9627-e62af43b4f3d" containerName="extract-content"
Oct 03 14:05:23 crc kubenswrapper[4636]: E1003 14:05:23.794282 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="010ef4ac-9542-4a76-a005-385439b1045c" containerName="oauth-openshift"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794288 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="010ef4ac-9542-4a76-a005-385439b1045c" containerName="oauth-openshift"
Oct 03 14:05:23 crc kubenswrapper[4636]: E1003 14:05:23.794298 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a9d47ea-2f67-4930-b870-cb6a68815b0f" containerName="registry-server"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794303 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a9d47ea-2f67-4930-b870-cb6a68815b0f" containerName="registry-server"
Oct 03 14:05:23 crc kubenswrapper[4636]: E1003 14:05:23.794311 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a9d47ea-2f67-4930-b870-cb6a68815b0f" containerName="extract-utilities"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794317 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a9d47ea-2f67-4930-b870-cb6a68815b0f" containerName="extract-utilities"
Oct 03 14:05:23 crc kubenswrapper[4636]: E1003 14:05:23.794325 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61e0cf54-1f42-4b7d-9787-db8e0cf348aa" containerName="pruner"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794330 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="61e0cf54-1f42-4b7d-9787-db8e0cf348aa" containerName="pruner"
Oct 03 14:05:23 crc kubenswrapper[4636]: E1003 14:05:23.794337 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6040820f-38e4-4416-8648-32025aee8fcb" containerName="extract-content"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794343 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="6040820f-38e4-4416-8648-32025aee8fcb" containerName="extract-content"
Oct 03 14:05:23 crc kubenswrapper[4636]: E1003 14:05:23.794352 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29d1406c-66ee-4666-9627-e62af43b4f3d" containerName="registry-server"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794357 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d1406c-66ee-4666-9627-e62af43b4f3d" containerName="registry-server"
Oct 03 14:05:23 crc kubenswrapper[4636]: E1003 14:05:23.794364 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d200ad-14e6-46da-bc1c-a600294e8600" containerName="extract-content"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794369 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d200ad-14e6-46da-bc1c-a600294e8600" containerName="extract-content"
Oct 03 14:05:23 crc kubenswrapper[4636]: E1003 14:05:23.794378 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d200ad-14e6-46da-bc1c-a600294e8600" containerName="registry-server"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794384 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d200ad-14e6-46da-bc1c-a600294e8600" containerName="registry-server"
Oct 03 14:05:23 crc kubenswrapper[4636]: E1003 14:05:23.794392 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6040820f-38e4-4416-8648-32025aee8fcb" containerName="registry-server"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794397 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="6040820f-38e4-4416-8648-32025aee8fcb" containerName="registry-server"
Oct 03 14:05:23 crc kubenswrapper[4636]: E1003 14:05:23.794406 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95979078-756b-4028-9c35-bc46d056eade" containerName="pruner"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794412 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="95979078-756b-4028-9c35-bc46d056eade" containerName="pruner"
Oct 03 14:05:23 crc kubenswrapper[4636]: E1003 14:05:23.794420 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a9d47ea-2f67-4930-b870-cb6a68815b0f" containerName="extract-content"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794426 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a9d47ea-2f67-4930-b870-cb6a68815b0f" containerName="extract-content"
Oct 03 14:05:23 crc kubenswrapper[4636]: E1003 14:05:23.794432 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d200ad-14e6-46da-bc1c-a600294e8600" containerName="extract-utilities"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794439 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d200ad-14e6-46da-bc1c-a600294e8600" containerName="extract-utilities"
Oct 03 14:05:23 crc kubenswrapper[4636]: E1003 14:05:23.794448 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6040820f-38e4-4416-8648-32025aee8fcb" containerName="extract-utilities"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794453 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="6040820f-38e4-4416-8648-32025aee8fcb" containerName="extract-utilities"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794537 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a9d47ea-2f67-4930-b870-cb6a68815b0f" containerName="registry-server"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794549 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9d200ad-14e6-46da-bc1c-a600294e8600" containerName="registry-server"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794559 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="010ef4ac-9542-4a76-a005-385439b1045c" containerName="oauth-openshift"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794565 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="61e0cf54-1f42-4b7d-9787-db8e0cf348aa" containerName="pruner"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794571 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="6040820f-38e4-4416-8648-32025aee8fcb" containerName="registry-server"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794581 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="29d1406c-66ee-4666-9627-e62af43b4f3d" containerName="registry-server"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794589 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="95979078-756b-4028-9c35-bc46d056eade" containerName="pruner"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.794867 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.820476 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.820530 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-system-serving-cert\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.820557 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.820585 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-system-session\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.820607 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-user-template-error\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.820637 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8afff867-5d84-4da6-a867-d5b636fa3483-audit-policies\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.820660 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-system-router-certs\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49"
Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.820683 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName:
\"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-user-template-login\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.820706 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.820728 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-system-service-ca\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.820800 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8afff867-5d84-4da6-a867-d5b636fa3483-audit-dir\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.820822 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.820844 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc56g\" (UniqueName: \"kubernetes.io/projected/8afff867-5d84-4da6-a867-d5b636fa3483-kube-api-access-sc56g\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.820875 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-system-cliconfig\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.852900 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-764f9b7cd5-w7l49"] Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.904438 4636 generic.go:334] "Generic (PLEG): container finished" podID="010ef4ac-9542-4a76-a005-385439b1045c" containerID="43bdcb2f8b7da8f7ea02fda5c514156bd0e582963636d28f5fab7f7593fa075f" exitCode=0 Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.904474 4636 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" event={"ID":"010ef4ac-9542-4a76-a005-385439b1045c","Type":"ContainerDied","Data":"43bdcb2f8b7da8f7ea02fda5c514156bd0e582963636d28f5fab7f7593fa075f"} Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.904498 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" event={"ID":"010ef4ac-9542-4a76-a005-385439b1045c","Type":"ContainerDied","Data":"4a6ead0a815cb8d4c734183f26d139625d9bca641df8471374c25a41fe039e01"} Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.904514 4636 scope.go:117] "RemoveContainer" containerID="43bdcb2f8b7da8f7ea02fda5c514156bd0e582963636d28f5fab7f7593fa075f" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.904610 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-srw4g" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.921859 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-ocp-branding-template\") pod \"010ef4ac-9542-4a76-a005-385439b1045c\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.921900 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/010ef4ac-9542-4a76-a005-385439b1045c-audit-dir\") pod \"010ef4ac-9542-4a76-a005-385439b1045c\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.921944 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-session\") pod \"010ef4ac-9542-4a76-a005-385439b1045c\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.921963 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-template-error\") pod \"010ef4ac-9542-4a76-a005-385439b1045c\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.921989 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-template-login\") pod \"010ef4ac-9542-4a76-a005-385439b1045c\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922020 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-template-provider-selection\") pod \"010ef4ac-9542-4a76-a005-385439b1045c\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922055 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-serving-cert\") pod 
\"010ef4ac-9542-4a76-a005-385439b1045c\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922073 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-idp-0-file-data\") pod \"010ef4ac-9542-4a76-a005-385439b1045c\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922088 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-service-ca\") pod \"010ef4ac-9542-4a76-a005-385439b1045c\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922124 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-router-certs\") pod \"010ef4ac-9542-4a76-a005-385439b1045c\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922142 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2kjc\" (UniqueName: \"kubernetes.io/projected/010ef4ac-9542-4a76-a005-385439b1045c-kube-api-access-z2kjc\") pod \"010ef4ac-9542-4a76-a005-385439b1045c\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922158 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-audit-policies\") pod \"010ef4ac-9542-4a76-a005-385439b1045c\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922174 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-trusted-ca-bundle\") pod \"010ef4ac-9542-4a76-a005-385439b1045c\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922197 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-cliconfig\") pod \"010ef4ac-9542-4a76-a005-385439b1045c\" (UID: \"010ef4ac-9542-4a76-a005-385439b1045c\") " Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922283 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922308 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-system-serving-cert\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: 
\"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922324 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922343 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-system-session\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922357 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-user-template-error\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922376 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8afff867-5d84-4da6-a867-d5b636fa3483-audit-policies\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922391 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-system-router-certs\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922406 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-user-template-login\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922422 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922438 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-system-service-ca\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: 
\"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922468 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8afff867-5d84-4da6-a867-d5b636fa3483-audit-dir\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922484 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922498 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc56g\" (UniqueName: \"kubernetes.io/projected/8afff867-5d84-4da6-a867-d5b636fa3483-kube-api-access-sc56g\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922520 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-system-cliconfig\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.922904 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/010ef4ac-9542-4a76-a005-385439b1045c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "010ef4ac-9542-4a76-a005-385439b1045c" (UID: "010ef4ac-9542-4a76-a005-385439b1045c"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.923370 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-system-cliconfig\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.926272 4636 scope.go:117] "RemoveContainer" containerID="43bdcb2f8b7da8f7ea02fda5c514156bd0e582963636d28f5fab7f7593fa075f" Oct 03 14:05:23 crc kubenswrapper[4636]: E1003 14:05:23.929519 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43bdcb2f8b7da8f7ea02fda5c514156bd0e582963636d28f5fab7f7593fa075f\": container with ID starting with 43bdcb2f8b7da8f7ea02fda5c514156bd0e582963636d28f5fab7f7593fa075f not found: ID does not exist" containerID="43bdcb2f8b7da8f7ea02fda5c514156bd0e582963636d28f5fab7f7593fa075f" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.929563 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43bdcb2f8b7da8f7ea02fda5c514156bd0e582963636d28f5fab7f7593fa075f"} err="failed to get container status \"43bdcb2f8b7da8f7ea02fda5c514156bd0e582963636d28f5fab7f7593fa075f\": rpc error: code = NotFound desc = could not find container \"43bdcb2f8b7da8f7ea02fda5c514156bd0e582963636d28f5fab7f7593fa075f\": container with ID starting with 43bdcb2f8b7da8f7ea02fda5c514156bd0e582963636d28f5fab7f7593fa075f not found: ID does not exist" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.931960 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-system-serving-cert\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.932249 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-user-template-error\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.933221 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.933678 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "010ef4ac-9542-4a76-a005-385439b1045c" (UID: "010ef4ac-9542-4a76-a005-385439b1045c"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.935321 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "010ef4ac-9542-4a76-a005-385439b1045c" (UID: "010ef4ac-9542-4a76-a005-385439b1045c"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.936038 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "010ef4ac-9542-4a76-a005-385439b1045c" (UID: "010ef4ac-9542-4a76-a005-385439b1045c"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.936147 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8afff867-5d84-4da6-a867-d5b636fa3483-audit-policies\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.936532 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "010ef4ac-9542-4a76-a005-385439b1045c" (UID: "010ef4ac-9542-4a76-a005-385439b1045c"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.936747 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-system-session\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.936815 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8afff867-5d84-4da6-a867-d5b636fa3483-audit-dir\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.937085 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "010ef4ac-9542-4a76-a005-385439b1045c" (UID: "010ef4ac-9542-4a76-a005-385439b1045c"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.937381 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-user-template-login\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.937508 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "010ef4ac-9542-4a76-a005-385439b1045c" (UID: "010ef4ac-9542-4a76-a005-385439b1045c"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.937839 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "010ef4ac-9542-4a76-a005-385439b1045c" (UID: "010ef4ac-9542-4a76-a005-385439b1045c"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.938599 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-system-service-ca\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.939914 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.940436 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/010ef4ac-9542-4a76-a005-385439b1045c-kube-api-access-z2kjc" (OuterVolumeSpecName: "kube-api-access-z2kjc") pod "010ef4ac-9542-4a76-a005-385439b1045c" (UID: "010ef4ac-9542-4a76-a005-385439b1045c"). InnerVolumeSpecName "kube-api-access-z2kjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.942920 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.943408 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.944698 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8afff867-5d84-4da6-a867-d5b636fa3483-v4-0-config-system-router-certs\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.946876 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "010ef4ac-9542-4a76-a005-385439b1045c" (UID: "010ef4ac-9542-4a76-a005-385439b1045c"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.948440 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "010ef4ac-9542-4a76-a005-385439b1045c" (UID: "010ef4ac-9542-4a76-a005-385439b1045c"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.953424 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "010ef4ac-9542-4a76-a005-385439b1045c" (UID: "010ef4ac-9542-4a76-a005-385439b1045c"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.954428 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "010ef4ac-9542-4a76-a005-385439b1045c" (UID: "010ef4ac-9542-4a76-a005-385439b1045c"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.955984 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "010ef4ac-9542-4a76-a005-385439b1045c" (UID: "010ef4ac-9542-4a76-a005-385439b1045c"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:05:23 crc kubenswrapper[4636]: I1003 14:05:23.962907 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc56g\" (UniqueName: \"kubernetes.io/projected/8afff867-5d84-4da6-a867-d5b636fa3483-kube-api-access-sc56g\") pod \"oauth-openshift-764f9b7cd5-w7l49\" (UID: \"8afff867-5d84-4da6-a867-d5b636fa3483\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.022976 4636 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.023008 4636 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.023017 4636 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.023027 4636 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.023036 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2kjc\" (UniqueName: \"kubernetes.io/projected/010ef4ac-9542-4a76-a005-385439b1045c-kube-api-access-z2kjc\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.023045 4636 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.023053 4636 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.023061 4636 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.023069 4636 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.023077 4636 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/010ef4ac-9542-4a76-a005-385439b1045c-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.023086 4636 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.023107 4636 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.023115 4636 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.023123 4636 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/010ef4ac-9542-4a76-a005-385439b1045c-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.112866 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.245174 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-srw4g"] Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.247537 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-srw4g"] Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.505849 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-764f9b7cd5-w7l49"] Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.799908 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="010ef4ac-9542-4a76-a005-385439b1045c" path="/var/lib/kubelet/pods/010ef4ac-9542-4a76-a005-385439b1045c/volumes" Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.911662 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" event={"ID":"8afff867-5d84-4da6-a867-d5b636fa3483","Type":"ContainerStarted","Data":"51c419c3145e6eb551aa6fe6f6b743e1936b765d5df22d0dc2af852b3f81aca2"} Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.911716 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" event={"ID":"8afff867-5d84-4da6-a867-d5b636fa3483","Type":"ContainerStarted","Data":"7c7b4d1345dc8c9bf148613e4714bde341308f17163dc68f406145e7c40f748c"} Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.911901 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:24 crc kubenswrapper[4636]: I1003 14:05:24.945652 4636 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" podStartSLOduration=26.945627312 podStartE2EDuration="26.945627312s" podCreationTimestamp="2025-10-03 14:04:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:05:24.939827455 +0000 UTC m=+274.798553792" watchObservedRunningTime="2025-10-03 14:05:24.945627312 +0000 UTC m=+274.804353589" Oct 03 14:05:25 crc kubenswrapper[4636]: I1003 14:05:25.127331 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-764f9b7cd5-w7l49" Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.356455 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tzzbm"] Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.357842 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tzzbm" podUID="a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34" containerName="registry-server" containerID="cri-o://3b9612b0f520f11a3bbaff7bac5b4b6355fef049cfe5c551021f305ef0d42ac7" gracePeriod=30 Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.361965 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7q4hj"] Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.362229 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7q4hj" podUID="3c76d8db-b385-49b1-b8cf-f7286f3e49c2" containerName="registry-server" containerID="cri-o://2bc71288f48749138f4364dd5760c4be53fa45b7eb21a16a8d7e5d59204251fa" gracePeriod=30 Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.379502 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rjp5j"] Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.379733 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j" podUID="bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed" containerName="marketplace-operator" containerID="cri-o://aadee51d005ae8bf2c42a2dd7725e0294b6f25c145eedff34843e2748e4177de" gracePeriod=30 Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.391458 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nswpx"] Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.391959 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nswpx" podUID="727c69c5-0eaa-4dba-b5a8-131486e3636e" containerName="registry-server" containerID="cri-o://ddc3865907961ed5eedde01e42691566be06720ac2ebaf3a3c07f916af94861a" gracePeriod=30 Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.400026 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zv2qw"] Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.400317 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zv2qw" podUID="8cc66cc3-d679-469f-9b34-021345e9007f" containerName="registry-server" containerID="cri-o://918207e92eb987cc1f203c45e5952c5b949098baed7a097e80af88ceeaa1cb51" gracePeriod=30 Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.427251 4636 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ncqfg"] Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.439255 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ncqfg" Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.450647 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ncqfg"] Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.599880 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r8k5\" (UniqueName: \"kubernetes.io/projected/eb4639ab-5b3c-4f36-9c1e-077930e571e3-kube-api-access-6r8k5\") pod \"marketplace-operator-79b997595-ncqfg\" (UID: \"eb4639ab-5b3c-4f36-9c1e-077930e571e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-ncqfg" Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.599957 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eb4639ab-5b3c-4f36-9c1e-077930e571e3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ncqfg\" (UID: \"eb4639ab-5b3c-4f36-9c1e-077930e571e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-ncqfg" Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.600139 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb4639ab-5b3c-4f36-9c1e-077930e571e3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ncqfg\" (UID: \"eb4639ab-5b3c-4f36-9c1e-077930e571e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-ncqfg" Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.702060 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eb4639ab-5b3c-4f36-9c1e-077930e571e3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ncqfg\" (UID: \"eb4639ab-5b3c-4f36-9c1e-077930e571e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-ncqfg" Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.702165 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb4639ab-5b3c-4f36-9c1e-077930e571e3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ncqfg\" (UID: \"eb4639ab-5b3c-4f36-9c1e-077930e571e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-ncqfg" Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.702254 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r8k5\" (UniqueName: \"kubernetes.io/projected/eb4639ab-5b3c-4f36-9c1e-077930e571e3-kube-api-access-6r8k5\") pod \"marketplace-operator-79b997595-ncqfg\" (UID: \"eb4639ab-5b3c-4f36-9c1e-077930e571e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-ncqfg" Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.704794 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb4639ab-5b3c-4f36-9c1e-077930e571e3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ncqfg\" (UID: \"eb4639ab-5b3c-4f36-9c1e-077930e571e3\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-ncqfg" Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.709781 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eb4639ab-5b3c-4f36-9c1e-077930e571e3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ncqfg\" (UID: \"eb4639ab-5b3c-4f36-9c1e-077930e571e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-ncqfg" Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.726519 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r8k5\" (UniqueName: \"kubernetes.io/projected/eb4639ab-5b3c-4f36-9c1e-077930e571e3-kube-api-access-6r8k5\") pod \"marketplace-operator-79b997595-ncqfg\" (UID: \"eb4639ab-5b3c-4f36-9c1e-077930e571e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-ncqfg" Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.749629 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ncqfg" Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.946830 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tzzbm" Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.975526 4636 generic.go:334] "Generic (PLEG): container finished" podID="8cc66cc3-d679-469f-9b34-021345e9007f" containerID="918207e92eb987cc1f203c45e5952c5b949098baed7a097e80af88ceeaa1cb51" exitCode=0 Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.975593 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zv2qw" event={"ID":"8cc66cc3-d679-469f-9b34-021345e9007f","Type":"ContainerDied","Data":"918207e92eb987cc1f203c45e5952c5b949098baed7a097e80af88ceeaa1cb51"} Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.976856 4636 generic.go:334] "Generic (PLEG): container finished" podID="bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed" containerID="aadee51d005ae8bf2c42a2dd7725e0294b6f25c145eedff34843e2748e4177de" exitCode=0 Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.976905 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j" event={"ID":"bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed","Type":"ContainerDied","Data":"aadee51d005ae8bf2c42a2dd7725e0294b6f25c145eedff34843e2748e4177de"} Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.980175 4636 generic.go:334] "Generic (PLEG): container finished" podID="3c76d8db-b385-49b1-b8cf-f7286f3e49c2" containerID="2bc71288f48749138f4364dd5760c4be53fa45b7eb21a16a8d7e5d59204251fa" exitCode=0 Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.980230 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7q4hj" event={"ID":"3c76d8db-b385-49b1-b8cf-f7286f3e49c2","Type":"ContainerDied","Data":"2bc71288f48749138f4364dd5760c4be53fa45b7eb21a16a8d7e5d59204251fa"} Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.981904 4636 generic.go:334] "Generic (PLEG): container finished" podID="a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34" containerID="3b9612b0f520f11a3bbaff7bac5b4b6355fef049cfe5c551021f305ef0d42ac7" exitCode=0 Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.981947 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzzbm" 
event={"ID":"a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34","Type":"ContainerDied","Data":"3b9612b0f520f11a3bbaff7bac5b4b6355fef049cfe5c551021f305ef0d42ac7"} Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.981965 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzzbm" event={"ID":"a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34","Type":"ContainerDied","Data":"9d2f835b2e5ffc0af843603389c31093b950af3fc45993cf76be590337013f03"} Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.981981 4636 scope.go:117] "RemoveContainer" containerID="3b9612b0f520f11a3bbaff7bac5b4b6355fef049cfe5c551021f305ef0d42ac7" Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.982078 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tzzbm" Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.986786 4636 generic.go:334] "Generic (PLEG): container finished" podID="727c69c5-0eaa-4dba-b5a8-131486e3636e" containerID="ddc3865907961ed5eedde01e42691566be06720ac2ebaf3a3c07f916af94861a" exitCode=0 Oct 03 14:05:36 crc kubenswrapper[4636]: I1003 14:05:36.986856 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nswpx" event={"ID":"727c69c5-0eaa-4dba-b5a8-131486e3636e","Type":"ContainerDied","Data":"ddc3865907961ed5eedde01e42691566be06720ac2ebaf3a3c07f916af94861a"} Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.007871 4636 scope.go:117] "RemoveContainer" containerID="e766a126e48e1e4d6914ac651a22788b00a8bbbd75432e4674650dc5751ea177" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.043516 4636 scope.go:117] "RemoveContainer" containerID="1eb4d1f23e9db0c809a1d874809576253d8881828582bf5ab73d5b7a096f49f8" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.075370 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zv2qw" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.093783 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.101232 4636 scope.go:117] "RemoveContainer" containerID="3b9612b0f520f11a3bbaff7bac5b4b6355fef049cfe5c551021f305ef0d42ac7" Oct 03 14:05:37 crc kubenswrapper[4636]: E1003 14:05:37.102721 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b9612b0f520f11a3bbaff7bac5b4b6355fef049cfe5c551021f305ef0d42ac7\": container with ID starting with 3b9612b0f520f11a3bbaff7bac5b4b6355fef049cfe5c551021f305ef0d42ac7 not found: ID does not exist" containerID="3b9612b0f520f11a3bbaff7bac5b4b6355fef049cfe5c551021f305ef0d42ac7" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.102759 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b9612b0f520f11a3bbaff7bac5b4b6355fef049cfe5c551021f305ef0d42ac7"} err="failed to get container status \"3b9612b0f520f11a3bbaff7bac5b4b6355fef049cfe5c551021f305ef0d42ac7\": rpc error: code = NotFound desc = could not find container \"3b9612b0f520f11a3bbaff7bac5b4b6355fef049cfe5c551021f305ef0d42ac7\": container with ID starting with 3b9612b0f520f11a3bbaff7bac5b4b6355fef049cfe5c551021f305ef0d42ac7 not found: ID does not exist" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.102787 4636 scope.go:117] "RemoveContainer" containerID="e766a126e48e1e4d6914ac651a22788b00a8bbbd75432e4674650dc5751ea177" Oct 03 14:05:37 crc kubenswrapper[4636]: E1003 14:05:37.106495 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e766a126e48e1e4d6914ac651a22788b00a8bbbd75432e4674650dc5751ea177\": container with ID starting with e766a126e48e1e4d6914ac651a22788b00a8bbbd75432e4674650dc5751ea177 not found: ID does not exist" containerID="e766a126e48e1e4d6914ac651a22788b00a8bbbd75432e4674650dc5751ea177" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.106567 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e766a126e48e1e4d6914ac651a22788b00a8bbbd75432e4674650dc5751ea177"} err="failed to get container status \"e766a126e48e1e4d6914ac651a22788b00a8bbbd75432e4674650dc5751ea177\": rpc error: code = NotFound desc = could not find container \"e766a126e48e1e4d6914ac651a22788b00a8bbbd75432e4674650dc5751ea177\": container with ID starting with e766a126e48e1e4d6914ac651a22788b00a8bbbd75432e4674650dc5751ea177 not found: ID does not exist" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.106605 4636 scope.go:117] "RemoveContainer" containerID="1eb4d1f23e9db0c809a1d874809576253d8881828582bf5ab73d5b7a096f49f8" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.108890 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34-catalog-content\") pod \"a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34\" (UID: \"a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34\") " Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.109008 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7srxq\" (UniqueName: \"kubernetes.io/projected/a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34-kube-api-access-7srxq\") pod \"a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34\" (UID: \"a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34\") " Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.114049 4636 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34-kube-api-access-7srxq" (OuterVolumeSpecName: "kube-api-access-7srxq") pod "a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34" (UID: "a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34"). InnerVolumeSpecName "kube-api-access-7srxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.114240 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34-utilities" (OuterVolumeSpecName: "utilities") pod "a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34" (UID: "a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.114734 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34-utilities\") pod \"a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34\" (UID: \"a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34\") " Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.117552 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed-marketplace-operator-metrics\") pod \"bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed\" (UID: \"bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed\") " Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.117639 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhpzv\" (UniqueName: \"kubernetes.io/projected/8cc66cc3-d679-469f-9b34-021345e9007f-kube-api-access-nhpzv\") pod \"8cc66cc3-d679-469f-9b34-021345e9007f\" (UID: \"8cc66cc3-d679-469f-9b34-021345e9007f\") " Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.117675 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cc66cc3-d679-469f-9b34-021345e9007f-utilities\") pod \"8cc66cc3-d679-469f-9b34-021345e9007f\" (UID: \"8cc66cc3-d679-469f-9b34-021345e9007f\") " Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.118880 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cc66cc3-d679-469f-9b34-021345e9007f-utilities" (OuterVolumeSpecName: "utilities") pod "8cc66cc3-d679-469f-9b34-021345e9007f" (UID: "8cc66cc3-d679-469f-9b34-021345e9007f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:05:37 crc kubenswrapper[4636]: E1003 14:05:37.121020 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eb4d1f23e9db0c809a1d874809576253d8881828582bf5ab73d5b7a096f49f8\": container with ID starting with 1eb4d1f23e9db0c809a1d874809576253d8881828582bf5ab73d5b7a096f49f8 not found: ID does not exist" containerID="1eb4d1f23e9db0c809a1d874809576253d8881828582bf5ab73d5b7a096f49f8" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.121068 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eb4d1f23e9db0c809a1d874809576253d8881828582bf5ab73d5b7a096f49f8"} err="failed to get container status \"1eb4d1f23e9db0c809a1d874809576253d8881828582bf5ab73d5b7a096f49f8\": rpc error: code = NotFound desc = could not find container \"1eb4d1f23e9db0c809a1d874809576253d8881828582bf5ab73d5b7a096f49f8\": container with ID starting with 1eb4d1f23e9db0c809a1d874809576253d8881828582bf5ab73d5b7a096f49f8 not found: ID does not exist" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.121878 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cc66cc3-d679-469f-9b34-021345e9007f-catalog-content\") pod \"8cc66cc3-d679-469f-9b34-021345e9007f\" (UID: \"8cc66cc3-d679-469f-9b34-021345e9007f\") " Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.122999 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cc66cc3-d679-469f-9b34-021345e9007f-kube-api-access-nhpzv" (OuterVolumeSpecName: "kube-api-access-nhpzv") pod "8cc66cc3-d679-469f-9b34-021345e9007f" (UID: "8cc66cc3-d679-469f-9b34-021345e9007f"). InnerVolumeSpecName "kube-api-access-nhpzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.126166 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed" (UID: "bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.126448 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7srxq\" (UniqueName: \"kubernetes.io/projected/a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34-kube-api-access-7srxq\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.126462 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.126472 4636 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.126480 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhpzv\" (UniqueName: \"kubernetes.io/projected/8cc66cc3-d679-469f-9b34-021345e9007f-kube-api-access-nhpzv\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.126490 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cc66cc3-d679-469f-9b34-021345e9007f-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.127552 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nswpx" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.154959 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7q4hj" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.212572 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34" (UID: "a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.225616 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cc66cc3-d679-469f-9b34-021345e9007f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8cc66cc3-d679-469f-9b34-021345e9007f" (UID: "8cc66cc3-d679-469f-9b34-021345e9007f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.227257 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c76d8db-b385-49b1-b8cf-f7286f3e49c2-utilities\") pod \"3c76d8db-b385-49b1-b8cf-f7286f3e49c2\" (UID: \"3c76d8db-b385-49b1-b8cf-f7286f3e49c2\") " Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.227326 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shpqx\" (UniqueName: \"kubernetes.io/projected/bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed-kube-api-access-shpqx\") pod \"bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed\" (UID: \"bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed\") " Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.227345 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cvmt\" (UniqueName: \"kubernetes.io/projected/727c69c5-0eaa-4dba-b5a8-131486e3636e-kube-api-access-4cvmt\") pod \"727c69c5-0eaa-4dba-b5a8-131486e3636e\" (UID: \"727c69c5-0eaa-4dba-b5a8-131486e3636e\") " Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.227364 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/727c69c5-0eaa-4dba-b5a8-131486e3636e-catalog-content\") pod \"727c69c5-0eaa-4dba-b5a8-131486e3636e\" (UID: \"727c69c5-0eaa-4dba-b5a8-131486e3636e\") " Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.227385 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed-marketplace-trusted-ca\") pod \"bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed\" (UID: \"bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed\") " Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.227403 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/727c69c5-0eaa-4dba-b5a8-131486e3636e-utilities\") pod \"727c69c5-0eaa-4dba-b5a8-131486e3636e\" (UID: \"727c69c5-0eaa-4dba-b5a8-131486e3636e\") " Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.227421 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c45pj\" (UniqueName: \"kubernetes.io/projected/3c76d8db-b385-49b1-b8cf-f7286f3e49c2-kube-api-access-c45pj\") pod \"3c76d8db-b385-49b1-b8cf-f7286f3e49c2\" (UID: \"3c76d8db-b385-49b1-b8cf-f7286f3e49c2\") " Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.227439 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c76d8db-b385-49b1-b8cf-f7286f3e49c2-catalog-content\") pod \"3c76d8db-b385-49b1-b8cf-f7286f3e49c2\" (UID: \"3c76d8db-b385-49b1-b8cf-f7286f3e49c2\") " Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.228147 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/727c69c5-0eaa-4dba-b5a8-131486e3636e-utilities" (OuterVolumeSpecName: "utilities") pod "727c69c5-0eaa-4dba-b5a8-131486e3636e" (UID: "727c69c5-0eaa-4dba-b5a8-131486e3636e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.228183 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed" (UID: "bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.228460 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c76d8db-b385-49b1-b8cf-f7286f3e49c2-utilities" (OuterVolumeSpecName: "utilities") pod "3c76d8db-b385-49b1-b8cf-f7286f3e49c2" (UID: "3c76d8db-b385-49b1-b8cf-f7286f3e49c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.230426 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/727c69c5-0eaa-4dba-b5a8-131486e3636e-kube-api-access-4cvmt" (OuterVolumeSpecName: "kube-api-access-4cvmt") pod "727c69c5-0eaa-4dba-b5a8-131486e3636e" (UID: "727c69c5-0eaa-4dba-b5a8-131486e3636e"). InnerVolumeSpecName "kube-api-access-4cvmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.235891 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c76d8db-b385-49b1-b8cf-f7286f3e49c2-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.236329 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cc66cc3-d679-469f-9b34-021345e9007f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.236511 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cvmt\" (UniqueName: \"kubernetes.io/projected/727c69c5-0eaa-4dba-b5a8-131486e3636e-kube-api-access-4cvmt\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.236527 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.236539 4636 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.236550 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/727c69c5-0eaa-4dba-b5a8-131486e3636e-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.236631 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c76d8db-b385-49b1-b8cf-f7286f3e49c2-kube-api-access-c45pj" (OuterVolumeSpecName: "kube-api-access-c45pj") pod "3c76d8db-b385-49b1-b8cf-f7286f3e49c2" (UID: "3c76d8db-b385-49b1-b8cf-f7286f3e49c2"). InnerVolumeSpecName "kube-api-access-c45pj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.238112 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed-kube-api-access-shpqx" (OuterVolumeSpecName: "kube-api-access-shpqx") pod "bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed" (UID: "bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed"). InnerVolumeSpecName "kube-api-access-shpqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.240223 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/727c69c5-0eaa-4dba-b5a8-131486e3636e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "727c69c5-0eaa-4dba-b5a8-131486e3636e" (UID: "727c69c5-0eaa-4dba-b5a8-131486e3636e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.273127 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c76d8db-b385-49b1-b8cf-f7286f3e49c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c76d8db-b385-49b1-b8cf-f7286f3e49c2" (UID: "3c76d8db-b385-49b1-b8cf-f7286f3e49c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.314054 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tzzbm"] Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.356510 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tzzbm"] Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.357302 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shpqx\" (UniqueName: \"kubernetes.io/projected/bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed-kube-api-access-shpqx\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.357323 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/727c69c5-0eaa-4dba-b5a8-131486e3636e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.357335 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c45pj\" (UniqueName: \"kubernetes.io/projected/3c76d8db-b385-49b1-b8cf-f7286f3e49c2-kube-api-access-c45pj\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.357349 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c76d8db-b385-49b1-b8cf-f7286f3e49c2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.435607 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ncqfg"] Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.996048 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nswpx" event={"ID":"727c69c5-0eaa-4dba-b5a8-131486e3636e","Type":"ContainerDied","Data":"f032d7e083943c1c6c8586cbe76cfee7835bbcd44571efc9a7022030c10f9a38"} Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.996354 4636 scope.go:117] "RemoveContainer" containerID="ddc3865907961ed5eedde01e42691566be06720ac2ebaf3a3c07f916af94861a" Oct 03 
Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.996087 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nswpx"
Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.998215 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ncqfg" event={"ID":"eb4639ab-5b3c-4f36-9c1e-077930e571e3","Type":"ContainerStarted","Data":"3158fab26eabbb788bc8301a979bf9663b0fe5d920aac5ee55f83af7547770da"}
Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.998247 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ncqfg" event={"ID":"eb4639ab-5b3c-4f36-9c1e-077930e571e3","Type":"ContainerStarted","Data":"9f14ef21aad5969864661b31ceb6fc3cd62f2b203bc4b43b3f76127dae60403f"}
Oct 03 14:05:37 crc kubenswrapper[4636]: I1003 14:05:37.998953 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ncqfg"
Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.001435 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zv2qw" event={"ID":"8cc66cc3-d679-469f-9b34-021345e9007f","Type":"ContainerDied","Data":"d2ed9f937758375c1da5307806db4308bb9f8504f76bbfa83001d2184d63c156"}
Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.001457 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zv2qw"
Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.002916 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ncqfg"
Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.005424 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j" event={"ID":"bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed","Type":"ContainerDied","Data":"769365e1af4a59f983c2e0c3faf59ba99cc52b5b7dc891bd343302f93b44e050"}
Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.005505 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rjp5j"
Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.008844 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7q4hj" event={"ID":"3c76d8db-b385-49b1-b8cf-f7286f3e49c2","Type":"ContainerDied","Data":"0fdf2896d034ddd9e0a4a0185db96a833091514e533bb3de7d7b294c930135b6"}
Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.008915 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7q4hj" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.018853 4636 scope.go:117] "RemoveContainer" containerID="d1618039f1da37bef855f651d1b78119b06f78365af3d1a576d5c20e8bd261dc" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.048206 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ncqfg" podStartSLOduration=2.048185924 podStartE2EDuration="2.048185924s" podCreationTimestamp="2025-10-03 14:05:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:05:38.023333179 +0000 UTC m=+287.882059436" watchObservedRunningTime="2025-10-03 14:05:38.048185924 +0000 UTC m=+287.906912171" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.052179 4636 scope.go:117] "RemoveContainer" containerID="73f743169419e0c48902a37f5f6c65c6362b2b07ea9654e3b3bdb41f93aefd2e" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.070056 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zv2qw"] Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.083880 4636 scope.go:117] "RemoveContainer" containerID="918207e92eb987cc1f203c45e5952c5b949098baed7a097e80af88ceeaa1cb51" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.086796 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zv2qw"] Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.091802 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nswpx"] Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.095682 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nswpx"] Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.099171 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rjp5j"] Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.101578 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rjp5j"] Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.105050 4636 scope.go:117] "RemoveContainer" containerID="c42ed1daf9e0e1edb0fce58be39b314e9c4812a34d6656b31d2624ad150a51a5" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.120962 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7q4hj"] Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.126184 4636 scope.go:117] "RemoveContainer" containerID="1d956c3f04c3abf69cf8f2d22c171011fad73a6ccceb8158ac7ea24a3befc2e5" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.138904 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7q4hj"] Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.153300 4636 scope.go:117] "RemoveContainer" containerID="aadee51d005ae8bf2c42a2dd7725e0294b6f25c145eedff34843e2748e4177de" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.174796 4636 scope.go:117] "RemoveContainer" containerID="2bc71288f48749138f4364dd5760c4be53fa45b7eb21a16a8d7e5d59204251fa" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.188066 4636 scope.go:117] "RemoveContainer" containerID="fedfe74661c83271b096815364d09726924ec85ee2239e0baf5c54be467505bb" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 
14:05:38.201055 4636 scope.go:117] "RemoveContainer" containerID="fa4f4aa981db7a9f369e06583da3251819c06265175662a399a8254fe838e68f" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.573482 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p7wgr"] Oct 03 14:05:38 crc kubenswrapper[4636]: E1003 14:05:38.573667 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="727c69c5-0eaa-4dba-b5a8-131486e3636e" containerName="extract-utilities" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.573681 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="727c69c5-0eaa-4dba-b5a8-131486e3636e" containerName="extract-utilities" Oct 03 14:05:38 crc kubenswrapper[4636]: E1003 14:05:38.573691 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed" containerName="marketplace-operator" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.573698 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed" containerName="marketplace-operator" Oct 03 14:05:38 crc kubenswrapper[4636]: E1003 14:05:38.573710 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34" containerName="extract-content" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.573717 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34" containerName="extract-content" Oct 03 14:05:38 crc kubenswrapper[4636]: E1003 14:05:38.573730 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cc66cc3-d679-469f-9b34-021345e9007f" containerName="extract-utilities" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.573738 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cc66cc3-d679-469f-9b34-021345e9007f" containerName="extract-utilities" Oct 03 14:05:38 crc kubenswrapper[4636]: E1003 14:05:38.573747 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="727c69c5-0eaa-4dba-b5a8-131486e3636e" containerName="registry-server" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.573754 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="727c69c5-0eaa-4dba-b5a8-131486e3636e" containerName="registry-server" Oct 03 14:05:38 crc kubenswrapper[4636]: E1003 14:05:38.573765 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34" containerName="extract-utilities" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.573772 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34" containerName="extract-utilities" Oct 03 14:05:38 crc kubenswrapper[4636]: E1003 14:05:38.573779 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c76d8db-b385-49b1-b8cf-f7286f3e49c2" containerName="extract-content" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.573786 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c76d8db-b385-49b1-b8cf-f7286f3e49c2" containerName="extract-content" Oct 03 14:05:38 crc kubenswrapper[4636]: E1003 14:05:38.573793 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="727c69c5-0eaa-4dba-b5a8-131486e3636e" containerName="extract-content" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.573800 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="727c69c5-0eaa-4dba-b5a8-131486e3636e" containerName="extract-content" Oct 03 14:05:38 crc kubenswrapper[4636]: E1003 
14:05:38.573810 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34" containerName="registry-server"
Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.573817 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34" containerName="registry-server"
Oct 03 14:05:38 crc kubenswrapper[4636]: E1003 14:05:38.573826 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cc66cc3-d679-469f-9b34-021345e9007f" containerName="registry-server"
Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.573832 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cc66cc3-d679-469f-9b34-021345e9007f" containerName="registry-server"
Oct 03 14:05:38 crc kubenswrapper[4636]: E1003 14:05:38.573840 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c76d8db-b385-49b1-b8cf-f7286f3e49c2" containerName="registry-server"
Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.573846 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c76d8db-b385-49b1-b8cf-f7286f3e49c2" containerName="registry-server"
Oct 03 14:05:38 crc kubenswrapper[4636]: E1003 14:05:38.573857 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c76d8db-b385-49b1-b8cf-f7286f3e49c2" containerName="extract-utilities"
Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.573864 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c76d8db-b385-49b1-b8cf-f7286f3e49c2" containerName="extract-utilities"
Oct 03 14:05:38 crc kubenswrapper[4636]: E1003 14:05:38.573874 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cc66cc3-d679-469f-9b34-021345e9007f" containerName="extract-content"
Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.573880 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cc66cc3-d679-469f-9b34-021345e9007f" containerName="extract-content"
Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.573969 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c76d8db-b385-49b1-b8cf-f7286f3e49c2" containerName="registry-server"
Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.573980 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed" containerName="marketplace-operator"
Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.573989 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cc66cc3-d679-469f-9b34-021345e9007f" containerName="registry-server"
Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.573999 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34" containerName="registry-server"
Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.574008 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="727c69c5-0eaa-4dba-b5a8-131486e3636e" containerName="registry-server"
Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.574691 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p7wgr"
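The interleaved cpu_manager.go / state_mem.go / memory_manager.go entries are the kubelet purging stale per-container CPU and memory assignments left by the five deleted pods before it admits the replacement catalog pods (the E-level lines are cosmetic; removal proceeds). This state is checkpointed under /var/lib/kubelet/cpu_manager_state and can be inspected directly; the JSON field names below are assumptions based on the default checkpoint layout, so treat this as a sketch, not a supported API:

package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// cpuManagerCheckpoint mirrors the assumed on-disk checkpoint layout.
type cpuManagerCheckpoint struct {
	PolicyName    string                       `json:"policyName"`
	DefaultCPUSet string                       `json:"defaultCpuSet"`
	Entries       map[string]map[string]string `json:"entries,omitempty"` // podUID -> container -> cpuset
}

func main() {
	raw, err := os.ReadFile("/var/lib/kubelet/cpu_manager_state")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	var cp cpuManagerCheckpoint
	if err := json.Unmarshal(raw, &cp); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Printf("policy=%s defaultCpuSet=%q pods with pinned CPUs: %d\n",
		cp.PolicyName, cp.DefaultCPUSet, len(cp.Entries))
}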
Need to start a new one" pod="openshift-marketplace/community-operators-p7wgr" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.579410 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.586209 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p7wgr"] Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.673787 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f8d0287-e1cd-461f-917e-febaa7ac576e-catalog-content\") pod \"community-operators-p7wgr\" (UID: \"6f8d0287-e1cd-461f-917e-febaa7ac576e\") " pod="openshift-marketplace/community-operators-p7wgr" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.673860 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z98v\" (UniqueName: \"kubernetes.io/projected/6f8d0287-e1cd-461f-917e-febaa7ac576e-kube-api-access-8z98v\") pod \"community-operators-p7wgr\" (UID: \"6f8d0287-e1cd-461f-917e-febaa7ac576e\") " pod="openshift-marketplace/community-operators-p7wgr" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.673940 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f8d0287-e1cd-461f-917e-febaa7ac576e-utilities\") pod \"community-operators-p7wgr\" (UID: \"6f8d0287-e1cd-461f-917e-febaa7ac576e\") " pod="openshift-marketplace/community-operators-p7wgr" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.774518 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k8gjq"] Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.775384 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k8gjq" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.775491 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f8d0287-e1cd-461f-917e-febaa7ac576e-utilities\") pod \"community-operators-p7wgr\" (UID: \"6f8d0287-e1cd-461f-917e-febaa7ac576e\") " pod="openshift-marketplace/community-operators-p7wgr" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.775557 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f8d0287-e1cd-461f-917e-febaa7ac576e-catalog-content\") pod \"community-operators-p7wgr\" (UID: \"6f8d0287-e1cd-461f-917e-febaa7ac576e\") " pod="openshift-marketplace/community-operators-p7wgr" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.775605 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z98v\" (UniqueName: \"kubernetes.io/projected/6f8d0287-e1cd-461f-917e-febaa7ac576e-kube-api-access-8z98v\") pod \"community-operators-p7wgr\" (UID: \"6f8d0287-e1cd-461f-917e-febaa7ac576e\") " pod="openshift-marketplace/community-operators-p7wgr" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.776048 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f8d0287-e1cd-461f-917e-febaa7ac576e-utilities\") pod \"community-operators-p7wgr\" (UID: \"6f8d0287-e1cd-461f-917e-febaa7ac576e\") " pod="openshift-marketplace/community-operators-p7wgr" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.776070 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f8d0287-e1cd-461f-917e-febaa7ac576e-catalog-content\") pod \"community-operators-p7wgr\" (UID: \"6f8d0287-e1cd-461f-917e-febaa7ac576e\") " pod="openshift-marketplace/community-operators-p7wgr" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.777799 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.788831 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k8gjq"] Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.802119 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c76d8db-b385-49b1-b8cf-f7286f3e49c2" path="/var/lib/kubelet/pods/3c76d8db-b385-49b1-b8cf-f7286f3e49c2/volumes" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.803813 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="727c69c5-0eaa-4dba-b5a8-131486e3636e" path="/var/lib/kubelet/pods/727c69c5-0eaa-4dba-b5a8-131486e3636e/volumes" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.804541 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cc66cc3-d679-469f-9b34-021345e9007f" path="/var/lib/kubelet/pods/8cc66cc3-d679-469f-9b34-021345e9007f/volumes" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.805844 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34" path="/var/lib/kubelet/pods/a7ff94a3-2b20-4ecb-a17e-bac9c51e2b34/volumes" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.805924 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8z98v\" (UniqueName: \"kubernetes.io/projected/6f8d0287-e1cd-461f-917e-febaa7ac576e-kube-api-access-8z98v\") pod \"community-operators-p7wgr\" (UID: \"6f8d0287-e1cd-461f-917e-febaa7ac576e\") " pod="openshift-marketplace/community-operators-p7wgr" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.806605 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed" path="/var/lib/kubelet/pods/bf1dc0fc-7cd7-46c2-8d8f-ae889ce93aed/volumes" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.876607 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8df150bb-9cae-4839-bc31-0211d3610788-catalog-content\") pod \"redhat-marketplace-k8gjq\" (UID: \"8df150bb-9cae-4839-bc31-0211d3610788\") " pod="openshift-marketplace/redhat-marketplace-k8gjq" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.876667 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dpsw\" (UniqueName: \"kubernetes.io/projected/8df150bb-9cae-4839-bc31-0211d3610788-kube-api-access-8dpsw\") pod \"redhat-marketplace-k8gjq\" (UID: \"8df150bb-9cae-4839-bc31-0211d3610788\") " pod="openshift-marketplace/redhat-marketplace-k8gjq" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.876700 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8df150bb-9cae-4839-bc31-0211d3610788-utilities\") pod \"redhat-marketplace-k8gjq\" (UID: \"8df150bb-9cae-4839-bc31-0211d3610788\") " pod="openshift-marketplace/redhat-marketplace-k8gjq" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.893292 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p7wgr" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.978886 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8df150bb-9cae-4839-bc31-0211d3610788-catalog-content\") pod \"redhat-marketplace-k8gjq\" (UID: \"8df150bb-9cae-4839-bc31-0211d3610788\") " pod="openshift-marketplace/redhat-marketplace-k8gjq" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.978933 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dpsw\" (UniqueName: \"kubernetes.io/projected/8df150bb-9cae-4839-bc31-0211d3610788-kube-api-access-8dpsw\") pod \"redhat-marketplace-k8gjq\" (UID: \"8df150bb-9cae-4839-bc31-0211d3610788\") " pod="openshift-marketplace/redhat-marketplace-k8gjq" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.978960 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8df150bb-9cae-4839-bc31-0211d3610788-utilities\") pod \"redhat-marketplace-k8gjq\" (UID: \"8df150bb-9cae-4839-bc31-0211d3610788\") " pod="openshift-marketplace/redhat-marketplace-k8gjq" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.979402 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8df150bb-9cae-4839-bc31-0211d3610788-utilities\") pod \"redhat-marketplace-k8gjq\" (UID: \"8df150bb-9cae-4839-bc31-0211d3610788\") " pod="openshift-marketplace/redhat-marketplace-k8gjq" Oct 03 14:05:38 crc kubenswrapper[4636]: I1003 14:05:38.979479 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8df150bb-9cae-4839-bc31-0211d3610788-catalog-content\") pod \"redhat-marketplace-k8gjq\" (UID: \"8df150bb-9cae-4839-bc31-0211d3610788\") " pod="openshift-marketplace/redhat-marketplace-k8gjq" Oct 03 14:05:39 crc kubenswrapper[4636]: I1003 14:05:39.005178 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dpsw\" (UniqueName: \"kubernetes.io/projected/8df150bb-9cae-4839-bc31-0211d3610788-kube-api-access-8dpsw\") pod \"redhat-marketplace-k8gjq\" (UID: \"8df150bb-9cae-4839-bc31-0211d3610788\") " pod="openshift-marketplace/redhat-marketplace-k8gjq" Oct 03 14:05:39 crc kubenswrapper[4636]: I1003 14:05:39.091572 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k8gjq" Oct 03 14:05:39 crc kubenswrapper[4636]: I1003 14:05:39.279688 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p7wgr"] Oct 03 14:05:39 crc kubenswrapper[4636]: W1003 14:05:39.287746 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f8d0287_e1cd_461f_917e_febaa7ac576e.slice/crio-145f28af9491479fc78d6eb7b4e7d455f1059c052da21ca7c99f6dc7ffb34ed4 WatchSource:0}: Error finding container 145f28af9491479fc78d6eb7b4e7d455f1059c052da21ca7c99f6dc7ffb34ed4: Status 404 returned error can't find the container with id 145f28af9491479fc78d6eb7b4e7d455f1059c052da21ca7c99f6dc7ffb34ed4 Oct 03 14:05:39 crc kubenswrapper[4636]: I1003 14:05:39.450452 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k8gjq"] Oct 03 14:05:40 crc kubenswrapper[4636]: I1003 14:05:40.034012 4636 generic.go:334] "Generic (PLEG): container finished" podID="8df150bb-9cae-4839-bc31-0211d3610788" containerID="9bf3443e61927e4efe5dbf475938ce6ca8a91d7bd88507f6b3bda02fa8a069e6" exitCode=0 Oct 03 14:05:40 crc kubenswrapper[4636]: I1003 14:05:40.034134 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8gjq" event={"ID":"8df150bb-9cae-4839-bc31-0211d3610788","Type":"ContainerDied","Data":"9bf3443e61927e4efe5dbf475938ce6ca8a91d7bd88507f6b3bda02fa8a069e6"} Oct 03 14:05:40 crc kubenswrapper[4636]: I1003 14:05:40.034409 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8gjq" event={"ID":"8df150bb-9cae-4839-bc31-0211d3610788","Type":"ContainerStarted","Data":"cce19d263a2d18947c4bb48f887ea07429fc02e8a51b0f4de5ec3f2746cf0e80"} Oct 03 14:05:40 crc kubenswrapper[4636]: I1003 14:05:40.036913 4636 generic.go:334] "Generic (PLEG): container finished" podID="6f8d0287-e1cd-461f-917e-febaa7ac576e" containerID="255483314a9436c189e66fa1d2f189104b1e2f34daab8bac58583e13262933c4" exitCode=0 Oct 03 14:05:40 crc kubenswrapper[4636]: I1003 14:05:40.036984 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p7wgr" event={"ID":"6f8d0287-e1cd-461f-917e-febaa7ac576e","Type":"ContainerDied","Data":"255483314a9436c189e66fa1d2f189104b1e2f34daab8bac58583e13262933c4"} Oct 03 14:05:40 crc kubenswrapper[4636]: I1003 14:05:40.037016 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p7wgr" event={"ID":"6f8d0287-e1cd-461f-917e-febaa7ac576e","Type":"ContainerStarted","Data":"145f28af9491479fc78d6eb7b4e7d455f1059c052da21ca7c99f6dc7ffb34ed4"} Oct 03 14:05:40 crc kubenswrapper[4636]: I1003 14:05:40.982329 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5cx5c"] Oct 03 14:05:40 crc kubenswrapper[4636]: I1003 14:05:40.984080 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5cx5c" Oct 03 14:05:40 crc kubenswrapper[4636]: I1003 14:05:40.986723 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.007257 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5cx5c"] Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.103367 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/955b6210-120c-4407-a1b0-2565f8407a8f-catalog-content\") pod \"certified-operators-5cx5c\" (UID: \"955b6210-120c-4407-a1b0-2565f8407a8f\") " pod="openshift-marketplace/certified-operators-5cx5c" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.103639 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/955b6210-120c-4407-a1b0-2565f8407a8f-utilities\") pod \"certified-operators-5cx5c\" (UID: \"955b6210-120c-4407-a1b0-2565f8407a8f\") " pod="openshift-marketplace/certified-operators-5cx5c" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.103709 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27fx9\" (UniqueName: \"kubernetes.io/projected/955b6210-120c-4407-a1b0-2565f8407a8f-kube-api-access-27fx9\") pod \"certified-operators-5cx5c\" (UID: \"955b6210-120c-4407-a1b0-2565f8407a8f\") " pod="openshift-marketplace/certified-operators-5cx5c" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.174033 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s4hpr"] Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.175139 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s4hpr" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.179499 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.181214 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s4hpr"] Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.205427 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/955b6210-120c-4407-a1b0-2565f8407a8f-utilities\") pod \"certified-operators-5cx5c\" (UID: \"955b6210-120c-4407-a1b0-2565f8407a8f\") " pod="openshift-marketplace/certified-operators-5cx5c" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.205777 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27fx9\" (UniqueName: \"kubernetes.io/projected/955b6210-120c-4407-a1b0-2565f8407a8f-kube-api-access-27fx9\") pod \"certified-operators-5cx5c\" (UID: \"955b6210-120c-4407-a1b0-2565f8407a8f\") " pod="openshift-marketplace/certified-operators-5cx5c" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.205812 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/955b6210-120c-4407-a1b0-2565f8407a8f-catalog-content\") pod \"certified-operators-5cx5c\" (UID: \"955b6210-120c-4407-a1b0-2565f8407a8f\") " pod="openshift-marketplace/certified-operators-5cx5c" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.205925 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/955b6210-120c-4407-a1b0-2565f8407a8f-utilities\") pod \"certified-operators-5cx5c\" (UID: \"955b6210-120c-4407-a1b0-2565f8407a8f\") " pod="openshift-marketplace/certified-operators-5cx5c" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.206143 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/955b6210-120c-4407-a1b0-2565f8407a8f-catalog-content\") pod \"certified-operators-5cx5c\" (UID: \"955b6210-120c-4407-a1b0-2565f8407a8f\") " pod="openshift-marketplace/certified-operators-5cx5c" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.222545 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27fx9\" (UniqueName: \"kubernetes.io/projected/955b6210-120c-4407-a1b0-2565f8407a8f-kube-api-access-27fx9\") pod \"certified-operators-5cx5c\" (UID: \"955b6210-120c-4407-a1b0-2565f8407a8f\") " pod="openshift-marketplace/certified-operators-5cx5c" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.307444 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtcfb\" (UniqueName: \"kubernetes.io/projected/7677fae0-2c20-47c0-aae2-52657add9d92-kube-api-access-xtcfb\") pod \"redhat-operators-s4hpr\" (UID: \"7677fae0-2c20-47c0-aae2-52657add9d92\") " pod="openshift-marketplace/redhat-operators-s4hpr" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.307543 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7677fae0-2c20-47c0-aae2-52657add9d92-utilities\") pod \"redhat-operators-s4hpr\" (UID: 
\"7677fae0-2c20-47c0-aae2-52657add9d92\") " pod="openshift-marketplace/redhat-operators-s4hpr" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.307603 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7677fae0-2c20-47c0-aae2-52657add9d92-catalog-content\") pod \"redhat-operators-s4hpr\" (UID: \"7677fae0-2c20-47c0-aae2-52657add9d92\") " pod="openshift-marketplace/redhat-operators-s4hpr" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.315966 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5cx5c" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.408606 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7677fae0-2c20-47c0-aae2-52657add9d92-utilities\") pod \"redhat-operators-s4hpr\" (UID: \"7677fae0-2c20-47c0-aae2-52657add9d92\") " pod="openshift-marketplace/redhat-operators-s4hpr" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.408834 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7677fae0-2c20-47c0-aae2-52657add9d92-catalog-content\") pod \"redhat-operators-s4hpr\" (UID: \"7677fae0-2c20-47c0-aae2-52657add9d92\") " pod="openshift-marketplace/redhat-operators-s4hpr" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.408974 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtcfb\" (UniqueName: \"kubernetes.io/projected/7677fae0-2c20-47c0-aae2-52657add9d92-kube-api-access-xtcfb\") pod \"redhat-operators-s4hpr\" (UID: \"7677fae0-2c20-47c0-aae2-52657add9d92\") " pod="openshift-marketplace/redhat-operators-s4hpr" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.410893 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7677fae0-2c20-47c0-aae2-52657add9d92-catalog-content\") pod \"redhat-operators-s4hpr\" (UID: \"7677fae0-2c20-47c0-aae2-52657add9d92\") " pod="openshift-marketplace/redhat-operators-s4hpr" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.412342 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7677fae0-2c20-47c0-aae2-52657add9d92-utilities\") pod \"redhat-operators-s4hpr\" (UID: \"7677fae0-2c20-47c0-aae2-52657add9d92\") " pod="openshift-marketplace/redhat-operators-s4hpr" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.430889 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtcfb\" (UniqueName: \"kubernetes.io/projected/7677fae0-2c20-47c0-aae2-52657add9d92-kube-api-access-xtcfb\") pod \"redhat-operators-s4hpr\" (UID: \"7677fae0-2c20-47c0-aae2-52657add9d92\") " pod="openshift-marketplace/redhat-operators-s4hpr" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.492223 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s4hpr" Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.711963 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s4hpr"] Oct 03 14:05:41 crc kubenswrapper[4636]: I1003 14:05:41.742602 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5cx5c"] Oct 03 14:05:41 crc kubenswrapper[4636]: W1003 14:05:41.760586 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod955b6210_120c_4407_a1b0_2565f8407a8f.slice/crio-b64c7e3f4f19da8141d198ed1ff2b72322fc56409bb14259fca6f9c5d68183f3 WatchSource:0}: Error finding container b64c7e3f4f19da8141d198ed1ff2b72322fc56409bb14259fca6f9c5d68183f3: Status 404 returned error can't find the container with id b64c7e3f4f19da8141d198ed1ff2b72322fc56409bb14259fca6f9c5d68183f3 Oct 03 14:05:41 crc kubenswrapper[4636]: W1003 14:05:41.760914 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7677fae0_2c20_47c0_aae2_52657add9d92.slice/crio-06bf1eab79efbeebf554ae9f54db4b500895da80e58063b8339520ac77e48ec3 WatchSource:0}: Error finding container 06bf1eab79efbeebf554ae9f54db4b500895da80e58063b8339520ac77e48ec3: Status 404 returned error can't find the container with id 06bf1eab79efbeebf554ae9f54db4b500895da80e58063b8339520ac77e48ec3 Oct 03 14:05:42 crc kubenswrapper[4636]: I1003 14:05:42.049071 4636 generic.go:334] "Generic (PLEG): container finished" podID="8df150bb-9cae-4839-bc31-0211d3610788" containerID="2944e41056a6be4861b32604e6852bc30397848f00c7bf6af0e7cdb7a5acb1dd" exitCode=0 Oct 03 14:05:42 crc kubenswrapper[4636]: I1003 14:05:42.049183 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8gjq" event={"ID":"8df150bb-9cae-4839-bc31-0211d3610788","Type":"ContainerDied","Data":"2944e41056a6be4861b32604e6852bc30397848f00c7bf6af0e7cdb7a5acb1dd"} Oct 03 14:05:42 crc kubenswrapper[4636]: I1003 14:05:42.052239 4636 generic.go:334] "Generic (PLEG): container finished" podID="7677fae0-2c20-47c0-aae2-52657add9d92" containerID="e0ae835d4ba27b8b7e1e35aedd78c9c96c344265be13db09309627148335b465" exitCode=0 Oct 03 14:05:42 crc kubenswrapper[4636]: I1003 14:05:42.052306 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s4hpr" event={"ID":"7677fae0-2c20-47c0-aae2-52657add9d92","Type":"ContainerDied","Data":"e0ae835d4ba27b8b7e1e35aedd78c9c96c344265be13db09309627148335b465"} Oct 03 14:05:42 crc kubenswrapper[4636]: I1003 14:05:42.052332 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s4hpr" event={"ID":"7677fae0-2c20-47c0-aae2-52657add9d92","Type":"ContainerStarted","Data":"06bf1eab79efbeebf554ae9f54db4b500895da80e58063b8339520ac77e48ec3"} Oct 03 14:05:42 crc kubenswrapper[4636]: I1003 14:05:42.061777 4636 generic.go:334] "Generic (PLEG): container finished" podID="955b6210-120c-4407-a1b0-2565f8407a8f" containerID="2fe64cba97c6c6d8b22acc58a02fe7e207962939dab6f2f6d95091507e3277fb" exitCode=0 Oct 03 14:05:42 crc kubenswrapper[4636]: I1003 14:05:42.061914 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cx5c" event={"ID":"955b6210-120c-4407-a1b0-2565f8407a8f","Type":"ContainerDied","Data":"2fe64cba97c6c6d8b22acc58a02fe7e207962939dab6f2f6d95091507e3277fb"} Oct 03 
14:05:42 crc kubenswrapper[4636]: I1003 14:05:42.061947 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cx5c" event={"ID":"955b6210-120c-4407-a1b0-2565f8407a8f","Type":"ContainerStarted","Data":"b64c7e3f4f19da8141d198ed1ff2b72322fc56409bb14259fca6f9c5d68183f3"}
Oct 03 14:05:42 crc kubenswrapper[4636]: I1003 14:05:42.068220 4636 generic.go:334] "Generic (PLEG): container finished" podID="6f8d0287-e1cd-461f-917e-febaa7ac576e" containerID="b864685dc3616f659f37040b4036173a12c07119e789c7aa702554efea851229" exitCode=0
Oct 03 14:05:42 crc kubenswrapper[4636]: I1003 14:05:42.068271 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p7wgr" event={"ID":"6f8d0287-e1cd-461f-917e-febaa7ac576e","Type":"ContainerDied","Data":"b864685dc3616f659f37040b4036173a12c07119e789c7aa702554efea851229"}
Oct 03 14:05:43 crc kubenswrapper[4636]: I1003 14:05:43.074683 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8gjq" event={"ID":"8df150bb-9cae-4839-bc31-0211d3610788","Type":"ContainerStarted","Data":"721cb88dcc37d93337755e8e59c2419140b3d531e50634672e51c9a11081c850"}
Oct 03 14:05:43 crc kubenswrapper[4636]: I1003 14:05:43.077381 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cx5c" event={"ID":"955b6210-120c-4407-a1b0-2565f8407a8f","Type":"ContainerStarted","Data":"f8de561966661341bac5bcc3f19c64533412d88f4069f974a45fc46419cf5fe2"}
Oct 03 14:05:43 crc kubenswrapper[4636]: I1003 14:05:43.080198 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p7wgr" event={"ID":"6f8d0287-e1cd-461f-917e-febaa7ac576e","Type":"ContainerStarted","Data":"2b5e8ae170720b24a9b78b071c66038869d3ba5ebc82e20c1e1ec69542932d97"}
Oct 03 14:05:43 crc kubenswrapper[4636]: I1003 14:05:43.095921 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k8gjq" podStartSLOduration=2.591747003 podStartE2EDuration="5.095902708s" podCreationTimestamp="2025-10-03 14:05:38 +0000 UTC" firstStartedPulling="2025-10-03 14:05:40.047147899 +0000 UTC m=+289.905874146" lastFinishedPulling="2025-10-03 14:05:42.551303604 +0000 UTC m=+292.410029851" observedRunningTime="2025-10-03 14:05:43.094636441 +0000 UTC m=+292.953362708" watchObservedRunningTime="2025-10-03 14:05:43.095902708 +0000 UTC m=+292.954628955"
Oct 03 14:05:43 crc kubenswrapper[4636]: I1003 14:05:43.130884 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p7wgr" podStartSLOduration=2.675387127 podStartE2EDuration="5.130869683s" podCreationTimestamp="2025-10-03 14:05:38 +0000 UTC" firstStartedPulling="2025-10-03 14:05:40.046728527 +0000 UTC m=+289.905454794" lastFinishedPulling="2025-10-03 14:05:42.502211103 +0000 UTC m=+292.360937350" observedRunningTime="2025-10-03 14:05:43.112627578 +0000 UTC m=+292.971353835" watchObservedRunningTime="2025-10-03 14:05:43.130869683 +0000 UTC m=+292.989595930"
Oct 03 14:05:44 crc kubenswrapper[4636]: I1003 14:05:44.087033 4636 generic.go:334] "Generic (PLEG): container finished" podID="955b6210-120c-4407-a1b0-2565f8407a8f" containerID="f8de561966661341bac5bcc3f19c64533412d88f4069f974a45fc46419cf5fe2" exitCode=0
Oct 03 14:05:44 crc kubenswrapper[4636]: I1003 14:05:44.087075 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cx5c" event={"ID":"955b6210-120c-4407-a1b0-2565f8407a8f","Type":"ContainerDied","Data":"f8de561966661341bac5bcc3f19c64533412d88f4069f974a45fc46419cf5fe2"}
Oct 03 14:05:44 crc kubenswrapper[4636]: I1003 14:05:44.089308 4636 generic.go:334] "Generic (PLEG): container finished" podID="7677fae0-2c20-47c0-aae2-52657add9d92" containerID="1a553c60799d07e2408949d4e8f3039fb7cf34cc0e79841ca0ec249434d221b6" exitCode=0
Oct 03 14:05:44 crc kubenswrapper[4636]: I1003 14:05:44.089369 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s4hpr" event={"ID":"7677fae0-2c20-47c0-aae2-52657add9d92","Type":"ContainerDied","Data":"1a553c60799d07e2408949d4e8f3039fb7cf34cc0e79841ca0ec249434d221b6"}
Oct 03 14:05:45 crc kubenswrapper[4636]: I1003 14:05:45.096083 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cx5c" event={"ID":"955b6210-120c-4407-a1b0-2565f8407a8f","Type":"ContainerStarted","Data":"23dd653e0cc1346991ad66d27276d983ab31dfca04c43246085f5bbad675c944"}
Oct 03 14:05:45 crc kubenswrapper[4636]: I1003 14:05:45.098637 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s4hpr" event={"ID":"7677fae0-2c20-47c0-aae2-52657add9d92","Type":"ContainerStarted","Data":"bf807f2dfef0c5a3013064ecd73be2b332e3a466989ec53e327885ced0ba2eb7"}
Oct 03 14:05:45 crc kubenswrapper[4636]: I1003 14:05:45.132255 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5cx5c" podStartSLOduration=2.638535143 podStartE2EDuration="5.132239888s" podCreationTimestamp="2025-10-03 14:05:40 +0000 UTC" firstStartedPulling="2025-10-03 14:05:42.070245898 +0000 UTC m=+291.928972145" lastFinishedPulling="2025-10-03 14:05:44.563950643 +0000 UTC m=+294.422676890" observedRunningTime="2025-10-03 14:05:45.114529639 +0000 UTC m=+294.973255906" watchObservedRunningTime="2025-10-03 14:05:45.132239888 +0000 UTC m=+294.990966135"
Oct 03 14:05:45 crc kubenswrapper[4636]: I1003 14:05:45.133695 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s4hpr" podStartSLOduration=1.415064629 podStartE2EDuration="4.133684419s" podCreationTimestamp="2025-10-03 14:05:41 +0000 UTC" firstStartedPulling="2025-10-03 14:05:42.056827723 +0000 UTC m=+291.915553970" lastFinishedPulling="2025-10-03 14:05:44.775447513 +0000 UTC m=+294.634173760" observedRunningTime="2025-10-03 14:05:45.130631921 +0000 UTC m=+294.989358168" watchObservedRunningTime="2025-10-03 14:05:45.133684419 +0000 UTC m=+294.992410676"
Oct 03 14:05:48 crc kubenswrapper[4636]: I1003 14:05:48.894694 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p7wgr"
Oct 03 14:05:48 crc kubenswrapper[4636]: I1003 14:05:48.895238 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p7wgr"
Oct 03 14:05:48 crc kubenswrapper[4636]: I1003 14:05:48.937947 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p7wgr"
Oct 03 14:05:49 crc kubenswrapper[4636]: I1003 14:05:49.091732 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k8gjq"
Oct 03 14:05:49 crc kubenswrapper[4636]: I1003 14:05:49.091826 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k8gjq"
Oct 03 14:05:49 crc kubenswrapper[4636]: I1003 14:05:49.139185 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k8gjq"
Oct 03 14:05:49 crc kubenswrapper[4636]: I1003 14:05:49.164269 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p7wgr"
Oct 03 14:05:50 crc kubenswrapper[4636]: I1003 14:05:50.157481 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k8gjq"
Oct 03 14:05:51 crc kubenswrapper[4636]: I1003 14:05:51.316668 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5cx5c"
Oct 03 14:05:51 crc kubenswrapper[4636]: I1003 14:05:51.316994 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5cx5c"
Oct 03 14:05:51 crc kubenswrapper[4636]: I1003 14:05:51.353431 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5cx5c"
Oct 03 14:05:51 crc kubenswrapper[4636]: I1003 14:05:51.493057 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s4hpr"
Oct 03 14:05:51 crc kubenswrapper[4636]: I1003 14:05:51.493133 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s4hpr"
Oct 03 14:05:51 crc kubenswrapper[4636]: I1003 14:05:51.529633 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s4hpr"
Oct 03 14:05:52 crc kubenswrapper[4636]: I1003 14:05:52.179660 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5cx5c"
Oct 03 14:05:52 crc kubenswrapper[4636]: I1003 14:05:52.180243 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s4hpr"
Oct 03 14:07:09 crc kubenswrapper[4636]: I1003 14:07:09.163295 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:07:09 crc kubenswrapper[4636]: I1003 14:07:09.163842 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:07:39 crc kubenswrapper[4636]: I1003 14:07:39.163234 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:07:39 crc kubenswrapper[4636]: I1003 14:07:39.163828 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:08:09 crc kubenswrapper[4636]: I1003 14:08:09.163389 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:08:09 crc kubenswrapper[4636]: I1003 14:08:09.163800 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:08:09 crc kubenswrapper[4636]: I1003 14:08:09.163845 4636 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ngmch"
Oct 03 14:08:09 crc kubenswrapper[4636]: I1003 14:08:09.164429 4636 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c0c4ea124622f317166a6ea5cb84988a6632a919c472664da27e020c7262591"} pod="openshift-machine-config-operator/machine-config-daemon-ngmch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 03 14:08:09 crc kubenswrapper[4636]: I1003 14:08:09.164478 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" containerID="cri-o://5c0c4ea124622f317166a6ea5cb84988a6632a919c472664da27e020c7262591" gracePeriod=600
Oct 03 14:08:09 crc kubenswrapper[4636]: I1003 14:08:09.820013 4636 generic.go:334] "Generic (PLEG): container finished" podID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerID="5c0c4ea124622f317166a6ea5cb84988a6632a919c472664da27e020c7262591" exitCode=0
Oct 03 14:08:09 crc kubenswrapper[4636]: I1003 14:08:09.820127 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerDied","Data":"5c0c4ea124622f317166a6ea5cb84988a6632a919c472664da27e020c7262591"}
Oct 03 14:08:09 crc kubenswrapper[4636]: I1003 14:08:09.820369 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerStarted","Data":"397ecbf6846cc3b94251ba0c02a817d6a89b5e1e5d3d2333691e31cc8372c3fc"}
Oct 03 14:08:09 crc kubenswrapper[4636]: I1003 14:08:09.820409 4636 scope.go:117] "RemoveContainer" containerID="8335b1869c8e33ac9e68f7823e5c05482d9a174552f24c31020da3ff45732e5f"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.465520 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w4hpz"]
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.467021 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.480626 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w4hpz"]
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.613609 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81af1b84-8951-4945-92dd-fcaa686d3d07-bound-sa-token\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.613745 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/81af1b84-8951-4945-92dd-fcaa686d3d07-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.614019 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/81af1b84-8951-4945-92dd-fcaa686d3d07-registry-certificates\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.614084 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/81af1b84-8951-4945-92dd-fcaa686d3d07-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.614203 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81af1b84-8951-4945-92dd-fcaa686d3d07-trusted-ca\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.614236 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nll99\" (UniqueName: \"kubernetes.io/projected/81af1b84-8951-4945-92dd-fcaa686d3d07-kube-api-access-nll99\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.614261 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81af1b84-8951-4945-92dd-fcaa686d3d07-registry-tls\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.614304 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.637667 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.715482 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/81af1b84-8951-4945-92dd-fcaa686d3d07-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.715546 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/81af1b84-8951-4945-92dd-fcaa686d3d07-registry-certificates\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.715572 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/81af1b84-8951-4945-92dd-fcaa686d3d07-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.715609 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81af1b84-8951-4945-92dd-fcaa686d3d07-trusted-ca\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.715635 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nll99\" (UniqueName: \"kubernetes.io/projected/81af1b84-8951-4945-92dd-fcaa686d3d07-kube-api-access-nll99\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.715653 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81af1b84-8951-4945-92dd-fcaa686d3d07-registry-tls\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.715843 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81af1b84-8951-4945-92dd-fcaa686d3d07-bound-sa-token\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.716644 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/81af1b84-8951-4945-92dd-fcaa686d3d07-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.716907 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/81af1b84-8951-4945-92dd-fcaa686d3d07-registry-certificates\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.716975 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81af1b84-8951-4945-92dd-fcaa686d3d07-trusted-ca\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.721415 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/81af1b84-8951-4945-92dd-fcaa686d3d07-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.721667 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81af1b84-8951-4945-92dd-fcaa686d3d07-registry-tls\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.738255 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81af1b84-8951-4945-92dd-fcaa686d3d07-bound-sa-token\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.738254 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nll99\" (UniqueName: \"kubernetes.io/projected/81af1b84-8951-4945-92dd-fcaa686d3d07-kube-api-access-nll99\") pod \"image-registry-66df7c8f76-w4hpz\" (UID: \"81af1b84-8951-4945-92dd-fcaa686d3d07\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.806515 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:46 crc kubenswrapper[4636]: I1003 14:08:46.990630 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w4hpz"]
Oct 03 14:08:47 crc kubenswrapper[4636]: I1003 14:08:47.033751 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz" event={"ID":"81af1b84-8951-4945-92dd-fcaa686d3d07","Type":"ContainerStarted","Data":"f49e4db3746b19d69eb66c7acd23481ec4ff47ae94d3fdec5586fe9e2ec6859d"}
Oct 03 14:08:48 crc kubenswrapper[4636]: I1003 14:08:48.040058 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz" event={"ID":"81af1b84-8951-4945-92dd-fcaa686d3d07","Type":"ContainerStarted","Data":"c870c904fd27eb4e767a0cc7ab3be9470ca3cf862198c4f72c4c6fd90712a086"}
Oct 03 14:08:48 crc kubenswrapper[4636]: I1003 14:08:48.041429 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:08:48 crc kubenswrapper[4636]: I1003 14:08:48.062802 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz" podStartSLOduration=2.062778825 podStartE2EDuration="2.062778825s" podCreationTimestamp="2025-10-03 14:08:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:08:48.061490782 +0000 UTC m=+477.920217029" watchObservedRunningTime="2025-10-03 14:08:48.062778825 +0000 UTC m=+477.921505072"
Oct 03 14:09:06 crc kubenswrapper[4636]: I1003 14:09:06.811025 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-w4hpz"
Oct 03 14:09:06 crc kubenswrapper[4636]: I1003 14:09:06.869922 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pk6zb"]
Oct 03 14:09:31 crc kubenswrapper[4636]: I1003 14:09:31.921034 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" podUID="967e0eca-11d1-4fb6-bba5-5fe993aaeac3" containerName="registry" containerID="cri-o://bcfef30fa5c1af62e4d71a887b182aa13e4760db7e643f19fdf3a43034f32132" gracePeriod=30
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.226065 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.263098 4636 generic.go:334] "Generic (PLEG): container finished" podID="967e0eca-11d1-4fb6-bba5-5fe993aaeac3" containerID="bcfef30fa5c1af62e4d71a887b182aa13e4760db7e643f19fdf3a43034f32132" exitCode=0
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.263202 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" event={"ID":"967e0eca-11d1-4fb6-bba5-5fe993aaeac3","Type":"ContainerDied","Data":"bcfef30fa5c1af62e4d71a887b182aa13e4760db7e643f19fdf3a43034f32132"}
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.263233 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb" event={"ID":"967e0eca-11d1-4fb6-bba5-5fe993aaeac3","Type":"ContainerDied","Data":"cb946e2061c6133754f8a032fd847b9ce6096233f41475de1ab4b60f7a4f37fe"}
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.263332 4636 scope.go:117] "RemoveContainer" containerID="bcfef30fa5c1af62e4d71a887b182aa13e4760db7e643f19fdf3a43034f32132"
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.263326 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pk6zb"
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.281792 4636 scope.go:117] "RemoveContainer" containerID="bcfef30fa5c1af62e4d71a887b182aa13e4760db7e643f19fdf3a43034f32132"
Oct 03 14:09:32 crc kubenswrapper[4636]: E1003 14:09:32.282258 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcfef30fa5c1af62e4d71a887b182aa13e4760db7e643f19fdf3a43034f32132\": container with ID starting with bcfef30fa5c1af62e4d71a887b182aa13e4760db7e643f19fdf3a43034f32132 not found: ID does not exist" containerID="bcfef30fa5c1af62e4d71a887b182aa13e4760db7e643f19fdf3a43034f32132"
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.282291 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcfef30fa5c1af62e4d71a887b182aa13e4760db7e643f19fdf3a43034f32132"} err="failed to get container status \"bcfef30fa5c1af62e4d71a887b182aa13e4760db7e643f19fdf3a43034f32132\": rpc error: code = NotFound desc = could not find container \"bcfef30fa5c1af62e4d71a887b182aa13e4760db7e643f19fdf3a43034f32132\": container with ID starting with bcfef30fa5c1af62e4d71a887b182aa13e4760db7e643f19fdf3a43034f32132 not found: ID does not exist"
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.287768 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-trusted-ca\") pod \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") "
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.287824 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-ca-trust-extracted\") pod \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") "
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.287843 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-registry-certificates\") pod \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") "
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.287899 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh6fd\" (UniqueName: \"kubernetes.io/projected/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-kube-api-access-vh6fd\") pod \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") "
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.287932 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-bound-sa-token\") pod \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") "
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.287969 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-registry-tls\") pod \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") "
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.287997 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-installation-pull-secrets\") pod \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") "
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.288174 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\" (UID: \"967e0eca-11d1-4fb6-bba5-5fe993aaeac3\") "
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.289549 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "967e0eca-11d1-4fb6-bba5-5fe993aaeac3" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.290383 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "967e0eca-11d1-4fb6-bba5-5fe993aaeac3" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.294696 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-kube-api-access-vh6fd" (OuterVolumeSpecName: "kube-api-access-vh6fd") pod "967e0eca-11d1-4fb6-bba5-5fe993aaeac3" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3"). InnerVolumeSpecName "kube-api-access-vh6fd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.294985 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "967e0eca-11d1-4fb6-bba5-5fe993aaeac3" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.295160 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "967e0eca-11d1-4fb6-bba5-5fe993aaeac3" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.298663 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "967e0eca-11d1-4fb6-bba5-5fe993aaeac3" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.300717 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "967e0eca-11d1-4fb6-bba5-5fe993aaeac3" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.305463 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "967e0eca-11d1-4fb6-bba5-5fe993aaeac3" (UID: "967e0eca-11d1-4fb6-bba5-5fe993aaeac3"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.389486 4636 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.389527 4636 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-registry-certificates\") on node \"crc\" DevicePath \"\""
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.389540 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh6fd\" (UniqueName: \"kubernetes.io/projected/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-kube-api-access-vh6fd\") on node \"crc\" DevicePath \"\""
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.389571 4636 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-bound-sa-token\") on node \"crc\" DevicePath \"\""
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.389580 4636 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-registry-tls\") on node \"crc\" DevicePath \"\""
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.389589 4636 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.389599 4636 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/967e0eca-11d1-4fb6-bba5-5fe993aaeac3-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.590163 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pk6zb"]
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.593222 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pk6zb"]
Oct 03 14:09:32 crc kubenswrapper[4636]: I1003 14:09:32.801133 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="967e0eca-11d1-4fb6-bba5-5fe993aaeac3" path="/var/lib/kubelet/pods/967e0eca-11d1-4fb6-bba5-5fe993aaeac3/volumes"
Oct 03 14:09:50 crc kubenswrapper[4636]: I1003 14:09:50.920056 4636 scope.go:117] "RemoveContainer" containerID="0086af18414e93472c560ec364dedea333390f0ffb9741d3fda61b6667a405d2"
Oct 03 14:09:50 crc kubenswrapper[4636]: I1003 14:09:50.937566 4636 scope.go:117] "RemoveContainer" containerID="3aeb4fc0ddb342d056e5b7631307c27ffaea35e67b9cd36eba1735e149858d6f"
Oct 03 14:10:09 crc kubenswrapper[4636]: I1003 14:10:09.163367 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:10:09 crc kubenswrapper[4636]: I1003 14:10:09.163916 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:10:39 crc kubenswrapper[4636]: I1003 14:10:39.163143 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:10:39 crc kubenswrapper[4636]: I1003 14:10:39.163706 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:10:50 crc kubenswrapper[4636]: I1003 14:10:50.981988 4636 scope.go:117] "RemoveContainer" containerID="85644ebf2f5e1d9d123df80960756d3d3cded380c1baa037a672094e20b782dd"
Oct 03 14:10:51 crc kubenswrapper[4636]: I1003 14:10:51.005325 4636 scope.go:117] "RemoveContainer" containerID="e9550686548a68f686a24176919b69e1e5e977105d0ccc0749edb21e4c30460d"
Oct 03 14:10:51 crc kubenswrapper[4636]: I1003 14:10:51.030439 4636 scope.go:117] "RemoveContainer" containerID="3b6c6f4b4b0a9020f5c8b5143e4223f2f550d043741c6dd089a09c4bbbe416eb"
Oct 03 14:10:51 crc kubenswrapper[4636]: I1003 14:10:51.052431 4636 scope.go:117] "RemoveContainer" containerID="b2308df50a1e1371b46703c434942a85c8133dec183aba8035792ae55a49dc54"
Oct 03 14:11:09 crc kubenswrapper[4636]: I1003 14:11:09.162908 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:11:09 crc kubenswrapper[4636]: I1003 14:11:09.163516 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:11:09 crc kubenswrapper[4636]: I1003 14:11:09.163578 4636 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ngmch"
Oct 03 14:11:09 crc kubenswrapper[4636]: I1003 14:11:09.164416 4636 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"397ecbf6846cc3b94251ba0c02a817d6a89b5e1e5d3d2333691e31cc8372c3fc"} pod="openshift-machine-config-operator/machine-config-daemon-ngmch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 03 14:11:09 crc kubenswrapper[4636]: I1003 14:11:09.164484 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" containerID="cri-o://397ecbf6846cc3b94251ba0c02a817d6a89b5e1e5d3d2333691e31cc8372c3fc" gracePeriod=600
Oct 03 14:11:09 crc kubenswrapper[4636]: I1003 14:11:09.775545 4636 generic.go:334] "Generic (PLEG): container finished" podID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerID="397ecbf6846cc3b94251ba0c02a817d6a89b5e1e5d3d2333691e31cc8372c3fc" exitCode=0
Oct 03 14:11:09 crc kubenswrapper[4636]: I1003 14:11:09.775573 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerDied","Data":"397ecbf6846cc3b94251ba0c02a817d6a89b5e1e5d3d2333691e31cc8372c3fc"}
Oct 03 14:11:09 crc kubenswrapper[4636]: I1003 14:11:09.775826 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerStarted","Data":"8c343b2c3198b919be0641d5d289b1294e3d107e0057a5a4c2427bf1f447e7a9"}
Oct 03 14:11:09 crc kubenswrapper[4636]: I1003 14:11:09.775843 4636 scope.go:117] "RemoveContainer" containerID="5c0c4ea124622f317166a6ea5cb84988a6632a919c472664da27e020c7262591"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.602753 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-lzr2w"]
Oct 03 14:11:38 crc kubenswrapper[4636]: E1003 14:11:38.604266 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="967e0eca-11d1-4fb6-bba5-5fe993aaeac3" containerName="registry"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.604331 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="967e0eca-11d1-4fb6-bba5-5fe993aaeac3" containerName="registry"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.604492 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="967e0eca-11d1-4fb6-bba5-5fe993aaeac3" containerName="registry"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.604917 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-lzr2w"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.607451 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.608133 4636 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-j62zv"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.609008 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.626898 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-lzr2w"]
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.630593 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-tswd6"]
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.631385 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-tswd6"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.633278 4636 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-sdqtf"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.655214 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-jw6vl"]
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.655954 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-jw6vl"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.657598 4636 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-p86ws"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.659326 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-tswd6"]
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.663002 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-jw6vl"]
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.694871 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vj5g\" (UniqueName: \"kubernetes.io/projected/d933c0ac-7ab5-4b2f-9602-5b277d92679e-kube-api-access-4vj5g\") pod \"cert-manager-webhook-5655c58dd6-jw6vl\" (UID: \"d933c0ac-7ab5-4b2f-9602-5b277d92679e\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-jw6vl"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.694933 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsgnh\" (UniqueName: \"kubernetes.io/projected/be83bffc-d4e8-469a-85d9-6cc8ec6b64f4-kube-api-access-dsgnh\") pod \"cert-manager-5b446d88c5-tswd6\" (UID: \"be83bffc-d4e8-469a-85d9-6cc8ec6b64f4\") " pod="cert-manager/cert-manager-5b446d88c5-tswd6"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.694993 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8d2t\" (UniqueName: \"kubernetes.io/projected/2974bed1-bc60-45f9-a4ce-42f14db27998-kube-api-access-s8d2t\") pod \"cert-manager-cainjector-7f985d654d-lzr2w\" (UID: \"2974bed1-bc60-45f9-a4ce-42f14db27998\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-lzr2w"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.795717 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vj5g\" (UniqueName: \"kubernetes.io/projected/d933c0ac-7ab5-4b2f-9602-5b277d92679e-kube-api-access-4vj5g\") pod \"cert-manager-webhook-5655c58dd6-jw6vl\" (UID: \"d933c0ac-7ab5-4b2f-9602-5b277d92679e\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-jw6vl"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.795829 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsgnh\" (UniqueName: \"kubernetes.io/projected/be83bffc-d4e8-469a-85d9-6cc8ec6b64f4-kube-api-access-dsgnh\") pod \"cert-manager-5b446d88c5-tswd6\" (UID: \"be83bffc-d4e8-469a-85d9-6cc8ec6b64f4\") " pod="cert-manager/cert-manager-5b446d88c5-tswd6"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.795884 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8d2t\" (UniqueName: \"kubernetes.io/projected/2974bed1-bc60-45f9-a4ce-42f14db27998-kube-api-access-s8d2t\") pod \"cert-manager-cainjector-7f985d654d-lzr2w\" (UID: \"2974bed1-bc60-45f9-a4ce-42f14db27998\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-lzr2w"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.816590 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8d2t\" (UniqueName: \"kubernetes.io/projected/2974bed1-bc60-45f9-a4ce-42f14db27998-kube-api-access-s8d2t\") pod \"cert-manager-cainjector-7f985d654d-lzr2w\" (UID: \"2974bed1-bc60-45f9-a4ce-42f14db27998\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-lzr2w"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.821284 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsgnh\" (UniqueName: \"kubernetes.io/projected/be83bffc-d4e8-469a-85d9-6cc8ec6b64f4-kube-api-access-dsgnh\") pod \"cert-manager-5b446d88c5-tswd6\" (UID: \"be83bffc-d4e8-469a-85d9-6cc8ec6b64f4\") " pod="cert-manager/cert-manager-5b446d88c5-tswd6"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.821957 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vj5g\" (UniqueName: \"kubernetes.io/projected/d933c0ac-7ab5-4b2f-9602-5b277d92679e-kube-api-access-4vj5g\") pod \"cert-manager-webhook-5655c58dd6-jw6vl\" (UID: \"d933c0ac-7ab5-4b2f-9602-5b277d92679e\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-jw6vl"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.923717 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-lzr2w"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.948979 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-tswd6"
Oct 03 14:11:38 crc kubenswrapper[4636]: I1003 14:11:38.974841 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-jw6vl"
Oct 03 14:11:39 crc kubenswrapper[4636]: I1003 14:11:39.140526 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-lzr2w"]
Oct 03 14:11:39 crc kubenswrapper[4636]: I1003 14:11:39.156234 4636 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 03 14:11:39 crc kubenswrapper[4636]: I1003 14:11:39.193197 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-tswd6"]
Oct 03 14:11:39 crc kubenswrapper[4636]: I1003 14:11:39.255488 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-jw6vl"]
Oct 03 14:11:39 crc kubenswrapper[4636]: I1003 14:11:39.914988 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-tswd6" event={"ID":"be83bffc-d4e8-469a-85d9-6cc8ec6b64f4","Type":"ContainerStarted","Data":"1c41bf78811a6263f222864db919b1e81aed5d43c42ada53587a8c3bea5fb8b8"}
Oct 03 14:11:39 crc kubenswrapper[4636]: I1003 14:11:39.916482 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-jw6vl" event={"ID":"d933c0ac-7ab5-4b2f-9602-5b277d92679e","Type":"ContainerStarted","Data":"22c8f2c22eedf56a2af63ef59274f2a38b140b7c813223ea1f84f3a56e3361dc"}
Oct 03 14:11:39 crc kubenswrapper[4636]: I1003 14:11:39.917832 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-lzr2w" event={"ID":"2974bed1-bc60-45f9-a4ce-42f14db27998","Type":"ContainerStarted","Data":"af4ce67598c4a991fcf4aeb5d20c19a5d73196f24ae7befbdbd8591c37089f59"}
Oct 03 14:11:42 crc kubenswrapper[4636]: I1003 14:11:42.933514 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-jw6vl" event={"ID":"d933c0ac-7ab5-4b2f-9602-5b277d92679e","Type":"ContainerStarted","Data":"71c4ac6f605c313526291b421615fb12a0349bc34fc5d876b1b97b6094a223a5"}
Oct 03 14:11:42 crc kubenswrapper[4636]: I1003 14:11:42.934053 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-jw6vl"
Oct 03 14:11:42 crc kubenswrapper[4636]: I1003 14:11:42.935219 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-lzr2w" event={"ID":"2974bed1-bc60-45f9-a4ce-42f14db27998","Type":"ContainerStarted","Data":"c9a08411f24ce1a10c68213d271869304b83c2b8e1275e78b906e96fb9c0c9be"}
Oct 03 14:11:42 crc kubenswrapper[4636]: I1003 14:11:42.936489 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-tswd6" event={"ID":"be83bffc-d4e8-469a-85d9-6cc8ec6b64f4","Type":"ContainerStarted","Data":"027da6775c8c017aeb819617323042adad9e7bf2bbb4f79b84bbcdf5afa5f501"}
Oct 03 14:11:42 crc kubenswrapper[4636]: I1003 14:11:42.952143 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-jw6vl" podStartSLOduration=1.677126945 podStartE2EDuration="4.952125058s" podCreationTimestamp="2025-10-03 14:11:38 +0000 UTC" firstStartedPulling="2025-10-03 14:11:39.263538281 +0000 UTC m=+649.122264528" lastFinishedPulling="2025-10-03 14:11:42.538536404 +0000 UTC m=+652.397262641" observedRunningTime="2025-10-03 14:11:42.947338655 +0000 UTC m=+652.806064922" watchObservedRunningTime="2025-10-03 14:11:42.952125058 +0000 UTC m=+652.810851305"
Oct 03 14:11:42 crc kubenswrapper[4636]: I1003 14:11:42.963308 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-lzr2w" podStartSLOduration=1.6366506699999999 podStartE2EDuration="4.963292787s" podCreationTimestamp="2025-10-03 14:11:38 +0000 UTC" firstStartedPulling="2025-10-03 14:11:39.155645634 +0000 UTC m=+649.014371881" lastFinishedPulling="2025-10-03 14:11:42.482287751 +0000 UTC m=+652.341013998" observedRunningTime="2025-10-03 14:11:42.962072485 +0000 UTC m=+652.820798742" watchObservedRunningTime="2025-10-03 14:11:42.963292787 +0000 UTC m=+652.822019034"
Oct 03 14:11:48 crc kubenswrapper[4636]: I1003 14:11:48.911582 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-tswd6" podStartSLOduration=7.636829572 podStartE2EDuration="10.911564338s" podCreationTimestamp="2025-10-03 14:11:38 +0000 UTC" firstStartedPulling="2025-10-03 14:11:39.205861201 +0000 UTC m=+649.064587438" lastFinishedPulling="2025-10-03 14:11:42.480595967 +0000 UTC m=+652.339322204" observedRunningTime="2025-10-03 14:11:42.980494881 +0000 UTC m=+652.839221148" watchObservedRunningTime="2025-10-03 14:11:48.911564338 +0000 UTC m=+658.770290585"
Oct 03 14:11:48 crc kubenswrapper[4636]: I1003 14:11:48.915628 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t7xd5"]
Oct 03 14:11:48 crc kubenswrapper[4636]: I1003 14:11:48.915997 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovn-controller" containerID="cri-o://c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4" gracePeriod=30
Oct 03 14:11:48 crc kubenswrapper[4636]: I1003 14:11:48.916390 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="sbdb" containerID="cri-o://76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362" gracePeriod=30
Oct 03 14:11:48 crc kubenswrapper[4636]: I1003 14:11:48.916435 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="nbdb" containerID="cri-o://5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422" gracePeriod=30
Oct 03 14:11:48 crc kubenswrapper[4636]: I1003 14:11:48.916465 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="northd" containerID="cri-o://9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4" gracePeriod=30
Oct 03 14:11:48 crc kubenswrapper[4636]: I1003 14:11:48.916493 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080" gracePeriod=30
Oct 03 14:11:48 crc kubenswrapper[4636]: I1003 14:11:48.916522 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="kube-rbac-proxy-node" containerID="cri-o://63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc" gracePeriod=30
Oct 03 14:11:48 crc kubenswrapper[4636]: I1003 14:11:48.916549 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovn-acl-logging" containerID="cri-o://552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847" gracePeriod=30
Oct 03 14:11:48 crc kubenswrapper[4636]: I1003 14:11:48.977308 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovnkube-controller" containerID="cri-o://76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1" gracePeriod=30
Oct 03 14:11:48 crc kubenswrapper[4636]: I1003 14:11:48.978519 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-jw6vl"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.272220 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7xd5_564529e3-ff40-4923-9f6d-319a9b41720a/ovnkube-controller/3.log"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.273920 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7xd5_564529e3-ff40-4923-9f6d-319a9b41720a/ovn-acl-logging/0.log"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.274320 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7xd5_564529e3-ff40-4923-9f6d-319a9b41720a/ovn-controller/0.log"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.274762 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.320811 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8ljr9"]
Oct 03 14:11:49 crc kubenswrapper[4636]: E1003 14:11:49.321118 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="kubecfg-setup"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321133 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="kubecfg-setup"
Oct 03 14:11:49 crc kubenswrapper[4636]: E1003 14:11:49.321148 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovn-acl-logging"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321216 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovn-acl-logging"
Oct 03 14:11:49 crc kubenswrapper[4636]: E1003 14:11:49.321238 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovnkube-controller"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321251 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovnkube-controller"
Oct 03 14:11:49 crc kubenswrapper[4636]: E1003 14:11:49.321261 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovnkube-controller"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321268 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovnkube-controller"
Oct 03 14:11:49 crc kubenswrapper[4636]: E1003 14:11:49.321276 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="kube-rbac-proxy-node"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321283 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="kube-rbac-proxy-node"
Oct 03 14:11:49 crc kubenswrapper[4636]: E1003 14:11:49.321297 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="northd"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321303 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="northd"
Oct 03 14:11:49 crc kubenswrapper[4636]: E1003 14:11:49.321313 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovnkube-controller"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321320 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovnkube-controller"
Oct 03 14:11:49 crc kubenswrapper[4636]: E1003 14:11:49.321326 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="sbdb"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321332 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="sbdb"
Oct 03 14:11:49 crc kubenswrapper[4636]: E1003 14:11:49.321342 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovn-controller"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321347 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovn-controller"
Oct 03 14:11:49 crc kubenswrapper[4636]: E1003 14:11:49.321354 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovnkube-controller"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321360 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovnkube-controller"
Oct 03 14:11:49 crc kubenswrapper[4636]: E1003 14:11:49.321369 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="kube-rbac-proxy-ovn-metrics"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321375 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="kube-rbac-proxy-ovn-metrics"
Oct 03 14:11:49 crc kubenswrapper[4636]: E1003 14:11:49.321383 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="nbdb"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321389 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="nbdb"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321483 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovn-controller"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321493 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovnkube-controller"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321499 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovnkube-controller"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321506 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="northd"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321513 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovn-acl-logging"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321523 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="kube-rbac-proxy-node"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321532 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="nbdb"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321539 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovnkube-controller"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321545 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="kube-rbac-proxy-ovn-metrics"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321552 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="sbdb"
Oct 03 14:11:49 crc kubenswrapper[4636]: E1003 14:11:49.321636
4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovnkube-controller" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321643 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovnkube-controller" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321722 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovnkube-controller" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.321731 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" containerName="ovnkube-controller" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.323202 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.329417 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-systemd-units\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.329649 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/534ddbf3-3b40-4541-9951-ffb0e7668fb3-ovnkube-config\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.329751 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-host-slash\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.329858 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-host-run-ovn-kubernetes\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.329955 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-host-run-netns\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.330067 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-run-ovn\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.330122 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-host-cni-bin\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.330152 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-var-lib-openvswitch\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.330175 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-run-systemd\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.330196 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-host-cni-netd\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.330233 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-log-socket\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.330259 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-etc-openvswitch\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.330311 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-node-log\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.330350 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.330383 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/534ddbf3-3b40-4541-9951-ffb0e7668fb3-env-overrides\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.330398 4636 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-host-kubelet\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.330424 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-run-openvswitch\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.430900 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-run-ovn\") pod \"564529e3-ff40-4923-9f6d-319a9b41720a\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.430939 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-run-systemd\") pod \"564529e3-ff40-4923-9f6d-319a9b41720a\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.430966 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-cni-bin\") pod \"564529e3-ff40-4923-9f6d-319a9b41720a\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.430985 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-cni-netd\") pod \"564529e3-ff40-4923-9f6d-319a9b41720a\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431004 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "564529e3-ff40-4923-9f6d-319a9b41720a" (UID: "564529e3-ff40-4923-9f6d-319a9b41720a"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431023 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p9qj\" (UniqueName: \"kubernetes.io/projected/564529e3-ff40-4923-9f6d-319a9b41720a-kube-api-access-2p9qj\") pod \"564529e3-ff40-4923-9f6d-319a9b41720a\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431039 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "564529e3-ff40-4923-9f6d-319a9b41720a" (UID: "564529e3-ff40-4923-9f6d-319a9b41720a"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431047 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-log-socket\") pod \"564529e3-ff40-4923-9f6d-319a9b41720a\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431059 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "564529e3-ff40-4923-9f6d-319a9b41720a" (UID: "564529e3-ff40-4923-9f6d-319a9b41720a"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431072 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"564529e3-ff40-4923-9f6d-319a9b41720a\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431118 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-slash\") pod \"564529e3-ff40-4923-9f6d-319a9b41720a\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431154 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-run-openvswitch\") pod \"564529e3-ff40-4923-9f6d-319a9b41720a\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431174 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-kubelet\") pod \"564529e3-ff40-4923-9f6d-319a9b41720a\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431198 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-node-log\") pod \"564529e3-ff40-4923-9f6d-319a9b41720a\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431220 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/564529e3-ff40-4923-9f6d-319a9b41720a-ovn-node-metrics-cert\") pod \"564529e3-ff40-4923-9f6d-319a9b41720a\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431237 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-systemd-units\") pod \"564529e3-ff40-4923-9f6d-319a9b41720a\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431257 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/564529e3-ff40-4923-9f6d-319a9b41720a-env-overrides\") pod \"564529e3-ff40-4923-9f6d-319a9b41720a\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431277 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-var-lib-openvswitch\") pod \"564529e3-ff40-4923-9f6d-319a9b41720a\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431298 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-etc-openvswitch\") pod \"564529e3-ff40-4923-9f6d-319a9b41720a\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431319 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/564529e3-ff40-4923-9f6d-319a9b41720a-ovnkube-script-lib\") pod \"564529e3-ff40-4923-9f6d-319a9b41720a\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431344 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/564529e3-ff40-4923-9f6d-319a9b41720a-ovnkube-config\") pod \"564529e3-ff40-4923-9f6d-319a9b41720a\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431366 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-run-netns\") pod \"564529e3-ff40-4923-9f6d-319a9b41720a\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431387 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-run-ovn-kubernetes\") pod \"564529e3-ff40-4923-9f6d-319a9b41720a\" (UID: \"564529e3-ff40-4923-9f6d-319a9b41720a\") " Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431459 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/534ddbf3-3b40-4541-9951-ffb0e7668fb3-ovnkube-script-lib\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431491 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431523 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/534ddbf3-3b40-4541-9951-ffb0e7668fb3-env-overrides\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431546 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-host-kubelet\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431576 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-run-openvswitch\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431604 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df5kw\" (UniqueName: \"kubernetes.io/projected/534ddbf3-3b40-4541-9951-ffb0e7668fb3-kube-api-access-df5kw\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431631 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-systemd-units\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431649 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/534ddbf3-3b40-4541-9951-ffb0e7668fb3-ovnkube-config\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431666 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-host-slash\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431692 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-host-run-ovn-kubernetes\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431747 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-host-run-netns\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431772 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-run-ovn\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc 
kubenswrapper[4636]: I1003 14:11:49.431796 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-host-cni-bin\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431825 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-var-lib-openvswitch\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431886 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-run-systemd\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431909 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-host-cni-netd\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431937 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-log-socket\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431958 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-etc-openvswitch\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.432012 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/534ddbf3-3b40-4541-9951-ffb0e7668fb3-ovn-node-metrics-cert\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.432041 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-node-log\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.432084 4636 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.432153 4636 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-cni-bin\") on node 
\"crc\" DevicePath \"\"" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.432169 4636 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.432213 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-node-log\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.432654 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-host-slash\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.432695 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-host-run-ovn-kubernetes\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.432724 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-host-run-netns\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.432882 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-run-ovn\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431330 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "564529e3-ff40-4923-9f6d-319a9b41720a" (UID: "564529e3-ff40-4923-9f6d-319a9b41720a"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.432912 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-host-cni-bin\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431363 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-log-socket" (OuterVolumeSpecName: "log-socket") pod "564529e3-ff40-4923-9f6d-319a9b41720a" (UID: "564529e3-ff40-4923-9f6d-319a9b41720a"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431380 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "564529e3-ff40-4923-9f6d-319a9b41720a" (UID: "564529e3-ff40-4923-9f6d-319a9b41720a"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431396 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-slash" (OuterVolumeSpecName: "host-slash") pod "564529e3-ff40-4923-9f6d-319a9b41720a" (UID: "564529e3-ff40-4923-9f6d-319a9b41720a"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431417 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "564529e3-ff40-4923-9f6d-319a9b41720a" (UID: "564529e3-ff40-4923-9f6d-319a9b41720a"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.432936 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/534ddbf3-3b40-4541-9951-ffb0e7668fb3-ovnkube-config\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.432962 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-run-systemd\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431433 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "564529e3-ff40-4923-9f6d-319a9b41720a" (UID: "564529e3-ff40-4923-9f6d-319a9b41720a"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431448 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-node-log" (OuterVolumeSpecName: "node-log") pod "564529e3-ff40-4923-9f6d-319a9b41720a" (UID: "564529e3-ff40-4923-9f6d-319a9b41720a"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.431864 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "564529e3-ff40-4923-9f6d-319a9b41720a" (UID: "564529e3-ff40-4923-9f6d-319a9b41720a"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.433000 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-host-kubelet\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.432256 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "564529e3-ff40-4923-9f6d-319a9b41720a" (UID: "564529e3-ff40-4923-9f6d-319a9b41720a"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.432289 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/564529e3-ff40-4923-9f6d-319a9b41720a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "564529e3-ff40-4923-9f6d-319a9b41720a" (UID: "564529e3-ff40-4923-9f6d-319a9b41720a"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.433025 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-host-cni-netd\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.432941 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-var-lib-openvswitch\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.433043 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-systemd-units\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.433016 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-run-openvswitch\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.432582 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/564529e3-ff40-4923-9f6d-319a9b41720a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "564529e3-ff40-4923-9f6d-319a9b41720a" (UID: "564529e3-ff40-4923-9f6d-319a9b41720a"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.432876 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/564529e3-ff40-4923-9f6d-319a9b41720a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "564529e3-ff40-4923-9f6d-319a9b41720a" (UID: "564529e3-ff40-4923-9f6d-319a9b41720a"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.432962 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "564529e3-ff40-4923-9f6d-319a9b41720a" (UID: "564529e3-ff40-4923-9f6d-319a9b41720a"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.432978 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "564529e3-ff40-4923-9f6d-319a9b41720a" (UID: "564529e3-ff40-4923-9f6d-319a9b41720a"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.433085 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-etc-openvswitch\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.433089 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-log-socket\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.433150 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/534ddbf3-3b40-4541-9951-ffb0e7668fb3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.433436 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/534ddbf3-3b40-4541-9951-ffb0e7668fb3-env-overrides\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.435898 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/564529e3-ff40-4923-9f6d-319a9b41720a-kube-api-access-2p9qj" (OuterVolumeSpecName: "kube-api-access-2p9qj") pod "564529e3-ff40-4923-9f6d-319a9b41720a" (UID: "564529e3-ff40-4923-9f6d-319a9b41720a"). InnerVolumeSpecName "kube-api-access-2p9qj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.436160 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/564529e3-ff40-4923-9f6d-319a9b41720a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "564529e3-ff40-4923-9f6d-319a9b41720a" (UID: "564529e3-ff40-4923-9f6d-319a9b41720a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.444119 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "564529e3-ff40-4923-9f6d-319a9b41720a" (UID: "564529e3-ff40-4923-9f6d-319a9b41720a"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.532692 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/534ddbf3-3b40-4541-9951-ffb0e7668fb3-ovn-node-metrics-cert\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.532735 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/534ddbf3-3b40-4541-9951-ffb0e7668fb3-ovnkube-script-lib\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.532776 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df5kw\" (UniqueName: \"kubernetes.io/projected/534ddbf3-3b40-4541-9951-ffb0e7668fb3-kube-api-access-df5kw\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.532833 4636 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.532845 4636 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-slash\") on node \"crc\" DevicePath \"\"" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.532854 4636 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.532862 4636 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.532872 4636 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-node-log\") on node \"crc\" DevicePath \"\"" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 
14:11:49.532880 4636 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/564529e3-ff40-4923-9f6d-319a9b41720a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.532888 4636 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.532896 4636 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/564529e3-ff40-4923-9f6d-319a9b41720a-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.532904 4636 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.532912 4636 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.532921 4636 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/564529e3-ff40-4923-9f6d-319a9b41720a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.532929 4636 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/564529e3-ff40-4923-9f6d-319a9b41720a-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.532936 4636 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.532944 4636 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.532952 4636 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.532959 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p9qj\" (UniqueName: \"kubernetes.io/projected/564529e3-ff40-4923-9f6d-319a9b41720a-kube-api-access-2p9qj\") on node \"crc\" DevicePath \"\"" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.532968 4636 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/564529e3-ff40-4923-9f6d-319a9b41720a-log-socket\") on node \"crc\" DevicePath \"\"" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.533645 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/534ddbf3-3b40-4541-9951-ffb0e7668fb3-ovnkube-script-lib\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.536685 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/534ddbf3-3b40-4541-9951-ffb0e7668fb3-ovn-node-metrics-cert\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.546369 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df5kw\" (UniqueName: \"kubernetes.io/projected/534ddbf3-3b40-4541-9951-ffb0e7668fb3-kube-api-access-df5kw\") pod \"ovnkube-node-8ljr9\" (UID: \"534ddbf3-3b40-4541-9951-ffb0e7668fb3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.636516 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.974874 4636 generic.go:334] "Generic (PLEG): container finished" podID="534ddbf3-3b40-4541-9951-ffb0e7668fb3" containerID="f5d9e648cfc8a032de6c90da3bbe09844465369f768b07b7485e4dcd9239bcb4" exitCode=0 Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.974915 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" event={"ID":"534ddbf3-3b40-4541-9951-ffb0e7668fb3","Type":"ContainerDied","Data":"f5d9e648cfc8a032de6c90da3bbe09844465369f768b07b7485e4dcd9239bcb4"} Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.976014 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" event={"ID":"534ddbf3-3b40-4541-9951-ffb0e7668fb3","Type":"ContainerStarted","Data":"96bb8479acd70b891e41fc3c4b262a1fcc4dfa587b34d3826b7d0c2c6b94e40d"} Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.977748 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7xd5_564529e3-ff40-4923-9f6d-319a9b41720a/ovnkube-controller/3.log" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.979809 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7xd5_564529e3-ff40-4923-9f6d-319a9b41720a/ovn-acl-logging/0.log" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.980640 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7xd5_564529e3-ff40-4923-9f6d-319a9b41720a/ovn-controller/0.log" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981029 4636 generic.go:334] "Generic (PLEG): container finished" podID="564529e3-ff40-4923-9f6d-319a9b41720a" containerID="76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1" exitCode=0 Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981057 4636 generic.go:334] "Generic (PLEG): container finished" podID="564529e3-ff40-4923-9f6d-319a9b41720a" containerID="76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362" exitCode=0 Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981068 4636 generic.go:334] "Generic (PLEG): container finished" podID="564529e3-ff40-4923-9f6d-319a9b41720a" containerID="5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422" exitCode=0 Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981078 4636 generic.go:334] "Generic (PLEG): container finished" podID="564529e3-ff40-4923-9f6d-319a9b41720a" 
containerID="9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4" exitCode=0 Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981089 4636 generic.go:334] "Generic (PLEG): container finished" podID="564529e3-ff40-4923-9f6d-319a9b41720a" containerID="a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080" exitCode=0 Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981149 4636 generic.go:334] "Generic (PLEG): container finished" podID="564529e3-ff40-4923-9f6d-319a9b41720a" containerID="63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc" exitCode=0 Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981158 4636 generic.go:334] "Generic (PLEG): container finished" podID="564529e3-ff40-4923-9f6d-319a9b41720a" containerID="552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847" exitCode=143 Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981158 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerDied","Data":"76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1"} Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981192 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerDied","Data":"76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362"} Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981233 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerDied","Data":"5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422"} Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981253 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerDied","Data":"9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4"} Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981265 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerDied","Data":"a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080"} Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981276 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerDied","Data":"63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc"} Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981315 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef"} Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981329 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362"} Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981335 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422"} Oct 03 14:11:49 crc kubenswrapper[4636]: 
I1003 14:11:49.981340 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981345 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981350 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981354 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981360 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981365 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981393 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerDied","Data":"552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981401 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981407 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981412 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981417 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981422 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981426 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981431 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981436 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981441 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981463 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981472 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerDied","Data":"c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981480 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981486 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981490 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981496 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981501 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981508 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981515 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981521 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981549 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981556 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981593 4636 scope.go:117] "RemoveContainer" containerID="76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1"
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981167 4636 generic.go:334] "Generic (PLEG): container finished" podID="564529e3-ff40-4923-9f6d-319a9b41720a" containerID="c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4" exitCode=143
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981746 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" event={"ID":"564529e3-ff40-4923-9f6d-319a9b41720a","Type":"ContainerDied","Data":"44658c054ab5f0148595b86dd04430dc1277d05a6f5bf8018cdf03f7318586fa"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981764 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981798 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981806 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981812 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981819 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981824 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981830 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981837 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981843 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4"}
Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.981871 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80"}
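The "container finished" entry above reports exitCode=143 for a container of ovnkube-node-t7xd5. That value is direct arithmetic: 143 = 128 + 15, i.e. the process ended on signal 15 (SIGTERM), which fits an orderly stop during pod deletion rather than a crash. By contrast, the kube-multus container below finishes with exitCode=2, an ordinary error exit, and that is what feeds the CrashLoopBackOff that follows.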
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t7xd5" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.986534 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ltsq6_140a698f-2661-4dc8-86d9-929b0d6dd326/kube-multus/2.log" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.986973 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ltsq6_140a698f-2661-4dc8-86d9-929b0d6dd326/kube-multus/1.log" Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.987018 4636 generic.go:334] "Generic (PLEG): container finished" podID="140a698f-2661-4dc8-86d9-929b0d6dd326" containerID="1b6aa2e19ac2f9f087fab0b525d8c3d4b09b610b1fa0aa8608d6083dcd243173" exitCode=2 Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.987048 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ltsq6" event={"ID":"140a698f-2661-4dc8-86d9-929b0d6dd326","Type":"ContainerDied","Data":"1b6aa2e19ac2f9f087fab0b525d8c3d4b09b610b1fa0aa8608d6083dcd243173"} Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.987069 4636 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f09d19aad0b3dd34eb48df35bc872b186fab30f7d6dc9fae25b3fa3b5b2c1d85"} Oct 03 14:11:49 crc kubenswrapper[4636]: I1003 14:11:49.987495 4636 scope.go:117] "RemoveContainer" containerID="1b6aa2e19ac2f9f087fab0b525d8c3d4b09b610b1fa0aa8608d6083dcd243173" Oct 03 14:11:49 crc kubenswrapper[4636]: E1003 14:11:49.987665 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-ltsq6_openshift-multus(140a698f-2661-4dc8-86d9-929b0d6dd326)\"" pod="openshift-multus/multus-ltsq6" podUID="140a698f-2661-4dc8-86d9-929b0d6dd326" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.007626 4636 scope.go:117] "RemoveContainer" containerID="1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.022667 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t7xd5"] Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.031506 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t7xd5"] Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.035648 4636 scope.go:117] "RemoveContainer" containerID="76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.085806 4636 scope.go:117] "RemoveContainer" containerID="5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.102846 4636 scope.go:117] "RemoveContainer" containerID="9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.163403 4636 scope.go:117] "RemoveContainer" containerID="a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.181591 4636 scope.go:117] "RemoveContainer" containerID="63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.203023 4636 scope.go:117] "RemoveContainer" containerID="552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.215746 4636 
scope.go:117] "RemoveContainer" containerID="c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.229509 4636 scope.go:117] "RemoveContainer" containerID="0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.243325 4636 scope.go:117] "RemoveContainer" containerID="76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1" Oct 03 14:11:50 crc kubenswrapper[4636]: E1003 14:11:50.244561 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1\": container with ID starting with 76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1 not found: ID does not exist" containerID="76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.244790 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1"} err="failed to get container status \"76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1\": rpc error: code = NotFound desc = could not find container \"76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1\": container with ID starting with 76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1 not found: ID does not exist" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.244882 4636 scope.go:117] "RemoveContainer" containerID="1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef" Oct 03 14:11:50 crc kubenswrapper[4636]: E1003 14:11:50.245243 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef\": container with ID starting with 1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef not found: ID does not exist" containerID="1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.245348 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef"} err="failed to get container status \"1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef\": rpc error: code = NotFound desc = could not find container \"1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef\": container with ID starting with 1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef not found: ID does not exist" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.245433 4636 scope.go:117] "RemoveContainer" containerID="76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362" Oct 03 14:11:50 crc kubenswrapper[4636]: E1003 14:11:50.245812 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\": container with ID starting with 76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362 not found: ID does not exist" containerID="76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.245926 4636 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362"} err="failed to get container status \"76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\": rpc error: code = NotFound desc = could not find container \"76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\": container with ID starting with 76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362 not found: ID does not exist" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.246026 4636 scope.go:117] "RemoveContainer" containerID="5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422" Oct 03 14:11:50 crc kubenswrapper[4636]: E1003 14:11:50.246357 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\": container with ID starting with 5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422 not found: ID does not exist" containerID="5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.246462 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422"} err="failed to get container status \"5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\": rpc error: code = NotFound desc = could not find container \"5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\": container with ID starting with 5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422 not found: ID does not exist" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.246558 4636 scope.go:117] "RemoveContainer" containerID="9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4" Oct 03 14:11:50 crc kubenswrapper[4636]: E1003 14:11:50.246858 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\": container with ID starting with 9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4 not found: ID does not exist" containerID="9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.246978 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4"} err="failed to get container status \"9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\": rpc error: code = NotFound desc = could not find container \"9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\": container with ID starting with 9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4 not found: ID does not exist" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.247079 4636 scope.go:117] "RemoveContainer" containerID="a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080" Oct 03 14:11:50 crc kubenswrapper[4636]: E1003 14:11:50.247364 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\": container with ID starting with a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080 not found: ID does not exist" 
containerID="a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.247516 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080"} err="failed to get container status \"a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\": rpc error: code = NotFound desc = could not find container \"a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\": container with ID starting with a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080 not found: ID does not exist" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.247603 4636 scope.go:117] "RemoveContainer" containerID="63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc" Oct 03 14:11:50 crc kubenswrapper[4636]: E1003 14:11:50.247905 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\": container with ID starting with 63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc not found: ID does not exist" containerID="63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.248006 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc"} err="failed to get container status \"63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\": rpc error: code = NotFound desc = could not find container \"63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\": container with ID starting with 63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc not found: ID does not exist" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.248764 4636 scope.go:117] "RemoveContainer" containerID="552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847" Oct 03 14:11:50 crc kubenswrapper[4636]: E1003 14:11:50.249315 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\": container with ID starting with 552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847 not found: ID does not exist" containerID="552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.249501 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847"} err="failed to get container status \"552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\": rpc error: code = NotFound desc = could not find container \"552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\": container with ID starting with 552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847 not found: ID does not exist" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.249593 4636 scope.go:117] "RemoveContainer" containerID="c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4" Oct 03 14:11:50 crc kubenswrapper[4636]: E1003 14:11:50.250172 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\": container with ID starting with c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4 not found: ID does not exist" containerID="c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.250212 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4"} err="failed to get container status \"c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\": rpc error: code = NotFound desc = could not find container \"c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\": container with ID starting with c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4 not found: ID does not exist" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.250243 4636 scope.go:117] "RemoveContainer" containerID="0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80" Oct 03 14:11:50 crc kubenswrapper[4636]: E1003 14:11:50.250584 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\": container with ID starting with 0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80 not found: ID does not exist" containerID="0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.250697 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80"} err="failed to get container status \"0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\": rpc error: code = NotFound desc = could not find container \"0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\": container with ID starting with 0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80 not found: ID does not exist" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.250788 4636 scope.go:117] "RemoveContainer" containerID="76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.251555 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1"} err="failed to get container status \"76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1\": rpc error: code = NotFound desc = could not find container \"76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1\": container with ID starting with 76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1 not found: ID does not exist" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.251579 4636 scope.go:117] "RemoveContainer" containerID="1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef" Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.253051 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef"} err="failed to get container status \"1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef\": rpc error: code = NotFound desc = could not find container \"1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef\": container with ID starting with 
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.253051 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef"} err="failed to get container status \"1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef\": rpc error: code = NotFound desc = could not find container \"1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef\": container with ID starting with 1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.253300 4636 scope.go:117] "RemoveContainer" containerID="76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.255034 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362"} err="failed to get container status \"76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\": rpc error: code = NotFound desc = could not find container \"76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\": container with ID starting with 76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.255060 4636 scope.go:117] "RemoveContainer" containerID="5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.255368 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422"} err="failed to get container status \"5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\": rpc error: code = NotFound desc = could not find container \"5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\": container with ID starting with 5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.255550 4636 scope.go:117] "RemoveContainer" containerID="9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.255918 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4"} err="failed to get container status \"9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\": rpc error: code = NotFound desc = could not find container \"9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\": container with ID starting with 9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.256033 4636 scope.go:117] "RemoveContainer" containerID="a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.256624 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080"} err="failed to get container status \"a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\": rpc error: code = NotFound desc = could not find container \"a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\": container with ID starting with a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.256737 4636 scope.go:117] "RemoveContainer" containerID="63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.257069 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc"} err="failed to get container status \"63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\": rpc error: code = NotFound desc = could not find container \"63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\": container with ID starting with 63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.257090 4636 scope.go:117] "RemoveContainer" containerID="552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.257367 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847"} err="failed to get container status \"552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\": rpc error: code = NotFound desc = could not find container \"552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\": container with ID starting with 552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.257460 4636 scope.go:117] "RemoveContainer" containerID="c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.257746 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4"} err="failed to get container status \"c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\": rpc error: code = NotFound desc = could not find container \"c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\": container with ID starting with c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.257840 4636 scope.go:117] "RemoveContainer" containerID="0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.258203 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80"} err="failed to get container status \"0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\": rpc error: code = NotFound desc = could not find container \"0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\": container with ID starting with 0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.258345 4636 scope.go:117] "RemoveContainer" containerID="76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.258663 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1"} err="failed to get container status \"76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1\": rpc error: code = NotFound desc = could not find container \"76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1\": container with ID starting with 76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.258750 4636 scope.go:117] "RemoveContainer" containerID="1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.259017 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef"} err="failed to get container status \"1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef\": rpc error: code = NotFound desc = could not find container \"1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef\": container with ID starting with 1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.259215 4636 scope.go:117] "RemoveContainer" containerID="76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.259524 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362"} err="failed to get container status \"76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\": rpc error: code = NotFound desc = could not find container \"76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\": container with ID starting with 76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.259617 4636 scope.go:117] "RemoveContainer" containerID="5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.259865 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422"} err="failed to get container status \"5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\": rpc error: code = NotFound desc = could not find container \"5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\": container with ID starting with 5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.259990 4636 scope.go:117] "RemoveContainer" containerID="9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.260262 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4"} err="failed to get container status \"9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\": rpc error: code = NotFound desc = could not find container \"9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\": container with ID starting with 9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.260355 4636 scope.go:117] "RemoveContainer" containerID="a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.260677 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080"} err="failed to get container status \"a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\": rpc error: code = NotFound desc = could not find container \"a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\": container with ID starting with a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.260764 4636 scope.go:117] "RemoveContainer" containerID="63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.261062 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc"} err="failed to get container status \"63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\": rpc error: code = NotFound desc = could not find container \"63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\": container with ID starting with 63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.261221 4636 scope.go:117] "RemoveContainer" containerID="552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.261607 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847"} err="failed to get container status \"552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\": rpc error: code = NotFound desc = could not find container \"552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\": container with ID starting with 552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.261648 4636 scope.go:117] "RemoveContainer" containerID="c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.261932 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4"} err="failed to get container status \"c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\": rpc error: code = NotFound desc = could not find container \"c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\": container with ID starting with c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.262116 4636 scope.go:117] "RemoveContainer" containerID="0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.262387 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80"} err="failed to get container status \"0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\": rpc error: code = NotFound desc = could not find container \"0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\": container with ID starting with 0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.262498 4636 scope.go:117] "RemoveContainer" containerID="76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.262789 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1"} err="failed to get container status \"76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1\": rpc error: code = NotFound desc = could not find container \"76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1\": container with ID starting with 76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.262915 4636 scope.go:117] "RemoveContainer" containerID="1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.263554 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef"} err="failed to get container status \"1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef\": rpc error: code = NotFound desc = could not find container \"1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef\": container with ID starting with 1c1321d23d631390eb9d30e8ffdd7b6cafa7c3b4ab38413183ec37d5e0c212ef not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.263651 4636 scope.go:117] "RemoveContainer" containerID="76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.263947 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362"} err="failed to get container status \"76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\": rpc error: code = NotFound desc = could not find container \"76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362\": container with ID starting with 76c7eff0a6d11dae7b2f32ebe8e2033983d8ac3388bb87a63f077f484b953362 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.264032 4636 scope.go:117] "RemoveContainer" containerID="5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.264378 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422"} err="failed to get container status \"5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\": rpc error: code = NotFound desc = could not find container \"5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422\": container with ID starting with 5ff7f9dbd87d9556d997ebb5df0dbc21ba58664197b5f0ed5d07e0558d901422 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.264560 4636 scope.go:117] "RemoveContainer" containerID="9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.268256 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4"} err="failed to get container status \"9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\": rpc error: code = NotFound desc = could not find container \"9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4\": container with ID starting with 9e402ac35e232ad12a8eec47662a1430e80b51176a34bf80fe46522deeddafc4 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.268295 4636 scope.go:117] "RemoveContainer" containerID="a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.268704 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080"} err="failed to get container status \"a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\": rpc error: code = NotFound desc = could not find container \"a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080\": container with ID starting with a39726e63294a4976373fef27fc8bddbf2405734633325c10127c67bb79d4080 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.268844 4636 scope.go:117] "RemoveContainer" containerID="63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.269142 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc"} err="failed to get container status \"63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\": rpc error: code = NotFound desc = could not find container \"63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc\": container with ID starting with 63051b2c4f113870925e4d10e1411ebc7421f5d9711068601234c45042c100fc not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.269279 4636 scope.go:117] "RemoveContainer" containerID="552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.269545 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847"} err="failed to get container status \"552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\": rpc error: code = NotFound desc = could not find container \"552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847\": container with ID starting with 552d7d309753b6060975ad6c5b009f14e7581026c23ea469a1c32eaf4c382847 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.269656 4636 scope.go:117] "RemoveContainer" containerID="c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.269977 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4"} err="failed to get container status \"c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\": rpc error: code = NotFound desc = could not find container \"c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4\": container with ID starting with c0373ef9bca9a4b7848d70f68838054adc97bb9042991e8fb3a7f1f60d4c82a4 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.270091 4636 scope.go:117] "RemoveContainer" containerID="0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.270463 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80"} err="failed to get container status \"0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\": rpc error: code = NotFound desc = could not find container \"0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80\": container with ID starting with 0935d52e27c7b015d4ac8f6b7af8b189fdac20cd3203800f5ad5355816b8ea80 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.270601 4636 scope.go:117] "RemoveContainer" containerID="76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.270854 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1"} err="failed to get container status \"76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1\": rpc error: code = NotFound desc = could not find container \"76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1\": container with ID starting with 76252b931d684f1792e58c44c36a1aaebd59ace587f5dbf3e68335878b53d7a1 not found: ID does not exist"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.799861 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="564529e3-ff40-4923-9f6d-319a9b41720a" path="/var/lib/kubelet/pods/564529e3-ff40-4923-9f6d-319a9b41720a/volumes"
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.994647 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" event={"ID":"534ddbf3-3b40-4541-9951-ffb0e7668fb3","Type":"ContainerStarted","Data":"b56668a54d67c24ad45901a464fb7ad193b6e4c805a916bb0439a97df8a92d10"}
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.994685 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" event={"ID":"534ddbf3-3b40-4541-9951-ffb0e7668fb3","Type":"ContainerStarted","Data":"56ae9506ae0e4c727813ce07e4e3f98e509d0128ea93fcfc421d56639e0ce459"}
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.994695 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" event={"ID":"534ddbf3-3b40-4541-9951-ffb0e7668fb3","Type":"ContainerStarted","Data":"48b4680df182addf32e8c354e323a7ea363235d6faa6f6bd26922547e632cb98"}
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.994704 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" event={"ID":"534ddbf3-3b40-4541-9951-ffb0e7668fb3","Type":"ContainerStarted","Data":"48d034ac7381f9fcd1e7d85574b4916ef393fe45335eb8c9906bb507eff51606"}
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.994712 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" event={"ID":"534ddbf3-3b40-4541-9951-ffb0e7668fb3","Type":"ContainerStarted","Data":"3d19a4c273dd5320ca041606d1ad7a273d5a0ea55418bc30297a9a15ae16bd82"}
Oct 03 14:11:50 crc kubenswrapper[4636]: I1003 14:11:50.994719 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" event={"ID":"534ddbf3-3b40-4541-9951-ffb0e7668fb3","Type":"ContainerStarted","Data":"38d0f94b8c7e77ed54ad6ddf1363d692a48966bcd79123f6907a79d7719803bd"}
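The six back-to-back ContainerStarted events for ovnkube-node-8ljr9 above come from one PLEG relist: the kubelet periodically relists containers and emits one lifecycle event per observed state change. A toy version of that diff in Go (simplified types, not the kubelet's implementation):

    package main

    import "fmt"

    // diff compares two container-state snapshots and emits one event per
    // container whose state changed, the way a PLEG relist does.
    func diff(old, cur map[string]string) []string {
        var events []string
        for id, s := range cur {
            if old[id] != s {
                if s == "running" {
                    events = append(events, "ContainerStarted "+id)
                } else {
                    events = append(events, "ContainerDied "+id)
                }
            }
        }
        return events
    }

    func main() {
        prev := map[string]string{}
        now := map[string]string{"b56668a5": "running", "56ae9506": "running"}
        for _, e := range diff(prev, now) {
            fmt.Println(e) // one SyncLoop (PLEG) event per newly running container
        }
    }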
"RemoveContainer" containerID="f09d19aad0b3dd34eb48df35bc872b186fab30f7d6dc9fae25b3fa3b5b2c1d85" Oct 03 14:11:52 crc kubenswrapper[4636]: I1003 14:11:52.000558 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ltsq6_140a698f-2661-4dc8-86d9-929b0d6dd326/kube-multus/2.log" Oct 03 14:11:53 crc kubenswrapper[4636]: I1003 14:11:53.009960 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" event={"ID":"534ddbf3-3b40-4541-9951-ffb0e7668fb3","Type":"ContainerStarted","Data":"6ea81573131a5a62c828aa49e3f5ba00996e03f1da1f13374c1c9936b99e8415"} Oct 03 14:11:56 crc kubenswrapper[4636]: I1003 14:11:56.031359 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" event={"ID":"534ddbf3-3b40-4541-9951-ffb0e7668fb3","Type":"ContainerStarted","Data":"4a50d435758d8ba55f4a703fe69fa679464c4c5599388e181f7599fbe9c98daf"} Oct 03 14:11:56 crc kubenswrapper[4636]: I1003 14:11:56.031929 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:56 crc kubenswrapper[4636]: I1003 14:11:56.031947 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:56 crc kubenswrapper[4636]: I1003 14:11:56.058285 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:56 crc kubenswrapper[4636]: I1003 14:11:56.061301 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" podStartSLOduration=7.061279468 podStartE2EDuration="7.061279468s" podCreationTimestamp="2025-10-03 14:11:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:11:56.057468779 +0000 UTC m=+665.916195036" watchObservedRunningTime="2025-10-03 14:11:56.061279468 +0000 UTC m=+665.920005715" Oct 03 14:11:57 crc kubenswrapper[4636]: I1003 14:11:57.035747 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:11:57 crc kubenswrapper[4636]: I1003 14:11:57.058898 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:12:00 crc kubenswrapper[4636]: I1003 14:12:00.796492 4636 scope.go:117] "RemoveContainer" containerID="1b6aa2e19ac2f9f087fab0b525d8c3d4b09b610b1fa0aa8608d6083dcd243173" Oct 03 14:12:00 crc kubenswrapper[4636]: E1003 14:12:00.796964 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-ltsq6_openshift-multus(140a698f-2661-4dc8-86d9-929b0d6dd326)\"" pod="openshift-multus/multus-ltsq6" podUID="140a698f-2661-4dc8-86d9-929b0d6dd326" Oct 03 14:12:12 crc kubenswrapper[4636]: I1003 14:12:12.793758 4636 scope.go:117] "RemoveContainer" containerID="1b6aa2e19ac2f9f087fab0b525d8c3d4b09b610b1fa0aa8608d6083dcd243173" Oct 03 14:12:14 crc kubenswrapper[4636]: I1003 14:12:14.119469 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ltsq6_140a698f-2661-4dc8-86d9-929b0d6dd326/kube-multus/2.log" Oct 03 14:12:14 crc kubenswrapper[4636]: I1003 14:12:14.120601 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-ltsq6" event={"ID":"140a698f-2661-4dc8-86d9-929b0d6dd326","Type":"ContainerStarted","Data":"0c6b642e1846010eb97d8fac75b0c08b41ebd7434c1c66688e4967ed62563e7a"} Oct 03 14:12:19 crc kubenswrapper[4636]: I1003 14:12:19.661032 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8ljr9" Oct 03 14:12:32 crc kubenswrapper[4636]: I1003 14:12:32.948999 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr"] Oct 03 14:12:32 crc kubenswrapper[4636]: I1003 14:12:32.950493 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr" Oct 03 14:12:32 crc kubenswrapper[4636]: I1003 14:12:32.952650 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 14:12:32 crc kubenswrapper[4636]: I1003 14:12:32.958474 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr"] Oct 03 14:12:33 crc kubenswrapper[4636]: I1003 14:12:33.038741 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b557fea4-b20c-4f54-88af-89ed7d755cda-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr\" (UID: \"b557fea4-b20c-4f54-88af-89ed7d755cda\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr" Oct 03 14:12:33 crc kubenswrapper[4636]: I1003 14:12:33.038816 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b557fea4-b20c-4f54-88af-89ed7d755cda-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr\" (UID: \"b557fea4-b20c-4f54-88af-89ed7d755cda\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr" Oct 03 14:12:33 crc kubenswrapper[4636]: I1003 14:12:33.038934 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mj5z\" (UniqueName: \"kubernetes.io/projected/b557fea4-b20c-4f54-88af-89ed7d755cda-kube-api-access-8mj5z\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr\" (UID: \"b557fea4-b20c-4f54-88af-89ed7d755cda\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr" Oct 03 14:12:33 crc kubenswrapper[4636]: I1003 14:12:33.139906 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b557fea4-b20c-4f54-88af-89ed7d755cda-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr\" (UID: \"b557fea4-b20c-4f54-88af-89ed7d755cda\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr" Oct 03 14:12:33 crc kubenswrapper[4636]: I1003 14:12:33.139981 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mj5z\" (UniqueName: \"kubernetes.io/projected/b557fea4-b20c-4f54-88af-89ed7d755cda-kube-api-access-8mj5z\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr\" (UID: \"b557fea4-b20c-4f54-88af-89ed7d755cda\") " 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr" Oct 03 14:12:33 crc kubenswrapper[4636]: I1003 14:12:33.140025 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b557fea4-b20c-4f54-88af-89ed7d755cda-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr\" (UID: \"b557fea4-b20c-4f54-88af-89ed7d755cda\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr" Oct 03 14:12:33 crc kubenswrapper[4636]: I1003 14:12:33.140609 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b557fea4-b20c-4f54-88af-89ed7d755cda-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr\" (UID: \"b557fea4-b20c-4f54-88af-89ed7d755cda\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr" Oct 03 14:12:33 crc kubenswrapper[4636]: I1003 14:12:33.140643 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b557fea4-b20c-4f54-88af-89ed7d755cda-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr\" (UID: \"b557fea4-b20c-4f54-88af-89ed7d755cda\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr" Oct 03 14:12:33 crc kubenswrapper[4636]: I1003 14:12:33.161093 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mj5z\" (UniqueName: \"kubernetes.io/projected/b557fea4-b20c-4f54-88af-89ed7d755cda-kube-api-access-8mj5z\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr\" (UID: \"b557fea4-b20c-4f54-88af-89ed7d755cda\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr" Oct 03 14:12:33 crc kubenswrapper[4636]: I1003 14:12:33.268967 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr" Oct 03 14:12:33 crc kubenswrapper[4636]: I1003 14:12:33.653022 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr"] Oct 03 14:12:34 crc kubenswrapper[4636]: I1003 14:12:34.238208 4636 generic.go:334] "Generic (PLEG): container finished" podID="b557fea4-b20c-4f54-88af-89ed7d755cda" containerID="dca7e95c9210ea6c7fec1f7ef64cb79b86ea6391abdccf9c7902d777fe039624" exitCode=0 Oct 03 14:12:34 crc kubenswrapper[4636]: I1003 14:12:34.238257 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr" event={"ID":"b557fea4-b20c-4f54-88af-89ed7d755cda","Type":"ContainerDied","Data":"dca7e95c9210ea6c7fec1f7ef64cb79b86ea6391abdccf9c7902d777fe039624"} Oct 03 14:12:34 crc kubenswrapper[4636]: I1003 14:12:34.239222 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr" event={"ID":"b557fea4-b20c-4f54-88af-89ed7d755cda","Type":"ContainerStarted","Data":"97a43273f4bd48845e54dd62154fc77168716c80251b6df4b1b2fef2b394794a"} Oct 03 14:12:36 crc kubenswrapper[4636]: I1003 14:12:36.250493 4636 generic.go:334] "Generic (PLEG): container finished" podID="b557fea4-b20c-4f54-88af-89ed7d755cda" containerID="bbdee8c1eff57154c57aea5b2f06b52e9783c52541b4fca5746a24efa2364774" exitCode=0 Oct 03 14:12:36 crc kubenswrapper[4636]: I1003 14:12:36.250580 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr" event={"ID":"b557fea4-b20c-4f54-88af-89ed7d755cda","Type":"ContainerDied","Data":"bbdee8c1eff57154c57aea5b2f06b52e9783c52541b4fca5746a24efa2364774"} Oct 03 14:12:37 crc kubenswrapper[4636]: I1003 14:12:37.258376 4636 generic.go:334] "Generic (PLEG): container finished" podID="b557fea4-b20c-4f54-88af-89ed7d755cda" containerID="063cd2759e9041304b2a2fde5babe1a06b21e41e60c7fb2d27056c632a951f77" exitCode=0 Oct 03 14:12:37 crc kubenswrapper[4636]: I1003 14:12:37.259183 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr" event={"ID":"b557fea4-b20c-4f54-88af-89ed7d755cda","Type":"ContainerDied","Data":"063cd2759e9041304b2a2fde5babe1a06b21e41e60c7fb2d27056c632a951f77"} Oct 03 14:12:38 crc kubenswrapper[4636]: I1003 14:12:38.459578 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr" Oct 03 14:12:38 crc kubenswrapper[4636]: I1003 14:12:38.618425 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b557fea4-b20c-4f54-88af-89ed7d755cda-util\") pod \"b557fea4-b20c-4f54-88af-89ed7d755cda\" (UID: \"b557fea4-b20c-4f54-88af-89ed7d755cda\") " Oct 03 14:12:38 crc kubenswrapper[4636]: I1003 14:12:38.618536 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mj5z\" (UniqueName: \"kubernetes.io/projected/b557fea4-b20c-4f54-88af-89ed7d755cda-kube-api-access-8mj5z\") pod \"b557fea4-b20c-4f54-88af-89ed7d755cda\" (UID: \"b557fea4-b20c-4f54-88af-89ed7d755cda\") " Oct 03 14:12:38 crc kubenswrapper[4636]: I1003 14:12:38.618578 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b557fea4-b20c-4f54-88af-89ed7d755cda-bundle\") pod \"b557fea4-b20c-4f54-88af-89ed7d755cda\" (UID: \"b557fea4-b20c-4f54-88af-89ed7d755cda\") " Oct 03 14:12:38 crc kubenswrapper[4636]: I1003 14:12:38.619236 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b557fea4-b20c-4f54-88af-89ed7d755cda-bundle" (OuterVolumeSpecName: "bundle") pod "b557fea4-b20c-4f54-88af-89ed7d755cda" (UID: "b557fea4-b20c-4f54-88af-89ed7d755cda"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:12:38 crc kubenswrapper[4636]: I1003 14:12:38.623940 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b557fea4-b20c-4f54-88af-89ed7d755cda-kube-api-access-8mj5z" (OuterVolumeSpecName: "kube-api-access-8mj5z") pod "b557fea4-b20c-4f54-88af-89ed7d755cda" (UID: "b557fea4-b20c-4f54-88af-89ed7d755cda"). InnerVolumeSpecName "kube-api-access-8mj5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:12:38 crc kubenswrapper[4636]: I1003 14:12:38.632465 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b557fea4-b20c-4f54-88af-89ed7d755cda-util" (OuterVolumeSpecName: "util") pod "b557fea4-b20c-4f54-88af-89ed7d755cda" (UID: "b557fea4-b20c-4f54-88af-89ed7d755cda"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:12:38 crc kubenswrapper[4636]: I1003 14:12:38.719835 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mj5z\" (UniqueName: \"kubernetes.io/projected/b557fea4-b20c-4f54-88af-89ed7d755cda-kube-api-access-8mj5z\") on node \"crc\" DevicePath \"\"" Oct 03 14:12:38 crc kubenswrapper[4636]: I1003 14:12:38.719882 4636 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b557fea4-b20c-4f54-88af-89ed7d755cda-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:12:38 crc kubenswrapper[4636]: I1003 14:12:38.719893 4636 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b557fea4-b20c-4f54-88af-89ed7d755cda-util\") on node \"crc\" DevicePath \"\"" Oct 03 14:12:39 crc kubenswrapper[4636]: I1003 14:12:39.270526 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr" event={"ID":"b557fea4-b20c-4f54-88af-89ed7d755cda","Type":"ContainerDied","Data":"97a43273f4bd48845e54dd62154fc77168716c80251b6df4b1b2fef2b394794a"} Oct 03 14:12:39 crc kubenswrapper[4636]: I1003 14:12:39.270817 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97a43273f4bd48845e54dd62154fc77168716c80251b6df4b1b2fef2b394794a" Oct 03 14:12:39 crc kubenswrapper[4636]: I1003 14:12:39.270652 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr" Oct 03 14:12:40 crc kubenswrapper[4636]: I1003 14:12:40.383597 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-7zh6n"] Oct 03 14:12:40 crc kubenswrapper[4636]: E1003 14:12:40.384062 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b557fea4-b20c-4f54-88af-89ed7d755cda" containerName="util" Oct 03 14:12:40 crc kubenswrapper[4636]: I1003 14:12:40.384073 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="b557fea4-b20c-4f54-88af-89ed7d755cda" containerName="util" Oct 03 14:12:40 crc kubenswrapper[4636]: E1003 14:12:40.384090 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b557fea4-b20c-4f54-88af-89ed7d755cda" containerName="pull" Oct 03 14:12:40 crc kubenswrapper[4636]: I1003 14:12:40.384111 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="b557fea4-b20c-4f54-88af-89ed7d755cda" containerName="pull" Oct 03 14:12:40 crc kubenswrapper[4636]: E1003 14:12:40.384119 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b557fea4-b20c-4f54-88af-89ed7d755cda" containerName="extract" Oct 03 14:12:40 crc kubenswrapper[4636]: I1003 14:12:40.384125 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="b557fea4-b20c-4f54-88af-89ed7d755cda" containerName="extract" Oct 03 14:12:40 crc kubenswrapper[4636]: I1003 14:12:40.384209 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="b557fea4-b20c-4f54-88af-89ed7d755cda" containerName="extract" Oct 03 14:12:40 crc kubenswrapper[4636]: I1003 14:12:40.384575 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-7zh6n" Oct 03 14:12:40 crc kubenswrapper[4636]: I1003 14:12:40.386468 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 03 14:12:40 crc kubenswrapper[4636]: I1003 14:12:40.387465 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 03 14:12:40 crc kubenswrapper[4636]: I1003 14:12:40.387700 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-6bc69" Oct 03 14:12:40 crc kubenswrapper[4636]: I1003 14:12:40.427170 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-7zh6n"] Oct 03 14:12:40 crc kubenswrapper[4636]: I1003 14:12:40.546134 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcxbf\" (UniqueName: \"kubernetes.io/projected/a043488f-1ceb-4faa-a72a-76172cf550f7-kube-api-access-tcxbf\") pod \"nmstate-operator-858ddd8f98-7zh6n\" (UID: \"a043488f-1ceb-4faa-a72a-76172cf550f7\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-7zh6n" Oct 03 14:12:40 crc kubenswrapper[4636]: I1003 14:12:40.647279 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcxbf\" (UniqueName: \"kubernetes.io/projected/a043488f-1ceb-4faa-a72a-76172cf550f7-kube-api-access-tcxbf\") pod \"nmstate-operator-858ddd8f98-7zh6n\" (UID: \"a043488f-1ceb-4faa-a72a-76172cf550f7\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-7zh6n" Oct 03 14:12:40 crc kubenswrapper[4636]: I1003 14:12:40.664166 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcxbf\" (UniqueName: \"kubernetes.io/projected/a043488f-1ceb-4faa-a72a-76172cf550f7-kube-api-access-tcxbf\") pod \"nmstate-operator-858ddd8f98-7zh6n\" (UID: \"a043488f-1ceb-4faa-a72a-76172cf550f7\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-7zh6n" Oct 03 14:12:40 crc kubenswrapper[4636]: I1003 14:12:40.698949 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-7zh6n" Oct 03 14:12:40 crc kubenswrapper[4636]: I1003 14:12:40.873223 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-7zh6n"] Oct 03 14:12:41 crc kubenswrapper[4636]: I1003 14:12:41.281250 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-7zh6n" event={"ID":"a043488f-1ceb-4faa-a72a-76172cf550f7","Type":"ContainerStarted","Data":"e70d979db3853ef3e708659895f14196d2c8bc059a8f6d8ee9bf1693ba4711f6"} Oct 03 14:12:44 crc kubenswrapper[4636]: I1003 14:12:44.299130 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-7zh6n" event={"ID":"a043488f-1ceb-4faa-a72a-76172cf550f7","Type":"ContainerStarted","Data":"205bf02aac476e53e54cbb246b1e9de1b8ab5b488de428c4cf5b543ca20e11d9"} Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.193956 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-7zh6n" podStartSLOduration=2.314820707 podStartE2EDuration="5.193940203s" podCreationTimestamp="2025-10-03 14:12:40 +0000 UTC" firstStartedPulling="2025-10-03 14:12:40.879418851 +0000 UTC m=+710.738145098" lastFinishedPulling="2025-10-03 14:12:43.758538347 +0000 UTC m=+713.617264594" observedRunningTime="2025-10-03 14:12:44.316013099 +0000 UTC m=+714.174739366" watchObservedRunningTime="2025-10-03 14:12:45.193940203 +0000 UTC m=+715.052666440" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.196289 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-vbc2m"] Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.197303 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vbc2m" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.199853 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-7l8hs" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.208058 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-46ppb"] Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.208804 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-46ppb" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.220043 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.228177 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-vbc2m"] Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.234671 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-46ppb"] Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.262319 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-mtj6z"] Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.263153 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-mtj6z" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.326487 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/dc21bb9c-24c3-4267-ab8d-96ed8e255c69-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-46ppb\" (UID: \"dc21bb9c-24c3-4267-ab8d-96ed8e255c69\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-46ppb" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.326547 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pbg7\" (UniqueName: \"kubernetes.io/projected/dc21bb9c-24c3-4267-ab8d-96ed8e255c69-kube-api-access-9pbg7\") pod \"nmstate-webhook-6cdbc54649-46ppb\" (UID: \"dc21bb9c-24c3-4267-ab8d-96ed8e255c69\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-46ppb" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.326620 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnf57\" (UniqueName: \"kubernetes.io/projected/438131cc-c24c-40a2-b874-8d1dca095f61-kube-api-access-vnf57\") pod \"nmstate-metrics-fdff9cb8d-vbc2m\" (UID: \"438131cc-c24c-40a2-b874-8d1dca095f61\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vbc2m" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.427185 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnf57\" (UniqueName: \"kubernetes.io/projected/438131cc-c24c-40a2-b874-8d1dca095f61-kube-api-access-vnf57\") pod \"nmstate-metrics-fdff9cb8d-vbc2m\" (UID: \"438131cc-c24c-40a2-b874-8d1dca095f61\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vbc2m" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.427225 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/89380ab9-db32-4562-aec2-69a9f3c703b6-nmstate-lock\") pod \"nmstate-handler-mtj6z\" (UID: \"89380ab9-db32-4562-aec2-69a9f3c703b6\") " pod="openshift-nmstate/nmstate-handler-mtj6z" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.427251 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/dc21bb9c-24c3-4267-ab8d-96ed8e255c69-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-46ppb\" (UID: \"dc21bb9c-24c3-4267-ab8d-96ed8e255c69\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-46ppb" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.427291 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pbg7\" (UniqueName: \"kubernetes.io/projected/dc21bb9c-24c3-4267-ab8d-96ed8e255c69-kube-api-access-9pbg7\") pod \"nmstate-webhook-6cdbc54649-46ppb\" (UID: \"dc21bb9c-24c3-4267-ab8d-96ed8e255c69\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-46ppb" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.427333 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfgd5\" (UniqueName: \"kubernetes.io/projected/89380ab9-db32-4562-aec2-69a9f3c703b6-kube-api-access-lfgd5\") pod \"nmstate-handler-mtj6z\" (UID: \"89380ab9-db32-4562-aec2-69a9f3c703b6\") " pod="openshift-nmstate/nmstate-handler-mtj6z" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.427357 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/89380ab9-db32-4562-aec2-69a9f3c703b6-ovs-socket\") pod \"nmstate-handler-mtj6z\" (UID: \"89380ab9-db32-4562-aec2-69a9f3c703b6\") " pod="openshift-nmstate/nmstate-handler-mtj6z" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.427382 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/89380ab9-db32-4562-aec2-69a9f3c703b6-dbus-socket\") pod \"nmstate-handler-mtj6z\" (UID: \"89380ab9-db32-4562-aec2-69a9f3c703b6\") " pod="openshift-nmstate/nmstate-handler-mtj6z" Oct 03 14:12:45 crc kubenswrapper[4636]: E1003 14:12:45.427474 4636 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 03 14:12:45 crc kubenswrapper[4636]: E1003 14:12:45.427512 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc21bb9c-24c3-4267-ab8d-96ed8e255c69-tls-key-pair podName:dc21bb9c-24c3-4267-ab8d-96ed8e255c69 nodeName:}" failed. No retries permitted until 2025-10-03 14:12:45.927497965 +0000 UTC m=+715.786224212 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/dc21bb9c-24c3-4267-ab8d-96ed8e255c69-tls-key-pair") pod "nmstate-webhook-6cdbc54649-46ppb" (UID: "dc21bb9c-24c3-4267-ab8d-96ed8e255c69") : secret "openshift-nmstate-webhook" not found Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.452857 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnf57\" (UniqueName: \"kubernetes.io/projected/438131cc-c24c-40a2-b874-8d1dca095f61-kube-api-access-vnf57\") pod \"nmstate-metrics-fdff9cb8d-vbc2m\" (UID: \"438131cc-c24c-40a2-b874-8d1dca095f61\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vbc2m" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.452922 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pbg7\" (UniqueName: \"kubernetes.io/projected/dc21bb9c-24c3-4267-ab8d-96ed8e255c69-kube-api-access-9pbg7\") pod \"nmstate-webhook-6cdbc54649-46ppb\" (UID: \"dc21bb9c-24c3-4267-ab8d-96ed8e255c69\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-46ppb" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.462265 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-zztw8"] Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.463032 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-zztw8" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.467026 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.467041 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.467522 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-7mrbp" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.485420 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-zztw8"] Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.515100 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vbc2m" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.528741 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/89380ab9-db32-4562-aec2-69a9f3c703b6-dbus-socket\") pod \"nmstate-handler-mtj6z\" (UID: \"89380ab9-db32-4562-aec2-69a9f3c703b6\") " pod="openshift-nmstate/nmstate-handler-mtj6z" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.528794 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/89380ab9-db32-4562-aec2-69a9f3c703b6-nmstate-lock\") pod \"nmstate-handler-mtj6z\" (UID: \"89380ab9-db32-4562-aec2-69a9f3c703b6\") " pod="openshift-nmstate/nmstate-handler-mtj6z" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.528885 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfgd5\" (UniqueName: \"kubernetes.io/projected/89380ab9-db32-4562-aec2-69a9f3c703b6-kube-api-access-lfgd5\") pod \"nmstate-handler-mtj6z\" (UID: \"89380ab9-db32-4562-aec2-69a9f3c703b6\") " pod="openshift-nmstate/nmstate-handler-mtj6z" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.530082 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/89380ab9-db32-4562-aec2-69a9f3c703b6-ovs-socket\") pod \"nmstate-handler-mtj6z\" (UID: \"89380ab9-db32-4562-aec2-69a9f3c703b6\") " pod="openshift-nmstate/nmstate-handler-mtj6z" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.533296 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/89380ab9-db32-4562-aec2-69a9f3c703b6-nmstate-lock\") pod \"nmstate-handler-mtj6z\" (UID: \"89380ab9-db32-4562-aec2-69a9f3c703b6\") " pod="openshift-nmstate/nmstate-handler-mtj6z" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.533397 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/89380ab9-db32-4562-aec2-69a9f3c703b6-dbus-socket\") pod \"nmstate-handler-mtj6z\" (UID: \"89380ab9-db32-4562-aec2-69a9f3c703b6\") " pod="openshift-nmstate/nmstate-handler-mtj6z" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.533535 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/89380ab9-db32-4562-aec2-69a9f3c703b6-ovs-socket\") pod \"nmstate-handler-mtj6z\" (UID: \"89380ab9-db32-4562-aec2-69a9f3c703b6\") " pod="openshift-nmstate/nmstate-handler-mtj6z" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.588789 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfgd5\" (UniqueName: \"kubernetes.io/projected/89380ab9-db32-4562-aec2-69a9f3c703b6-kube-api-access-lfgd5\") pod \"nmstate-handler-mtj6z\" (UID: \"89380ab9-db32-4562-aec2-69a9f3c703b6\") " pod="openshift-nmstate/nmstate-handler-mtj6z" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.632279 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bc7eb6e-0aa6-44a5-914e-7f3a97421f50-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-zztw8\" (UID: \"7bc7eb6e-0aa6-44a5-914e-7f3a97421f50\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-zztw8" Oct 
03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.632336 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7bc7eb6e-0aa6-44a5-914e-7f3a97421f50-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-zztw8\" (UID: \"7bc7eb6e-0aa6-44a5-914e-7f3a97421f50\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-zztw8" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.632645 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpckx\" (UniqueName: \"kubernetes.io/projected/7bc7eb6e-0aa6-44a5-914e-7f3a97421f50-kube-api-access-vpckx\") pod \"nmstate-console-plugin-6b874cbd85-zztw8\" (UID: \"7bc7eb6e-0aa6-44a5-914e-7f3a97421f50\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-zztw8" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.675600 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7ddf657454-4zb7x"] Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.676973 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.705200 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7ddf657454-4zb7x"] Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.736200 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bc7eb6e-0aa6-44a5-914e-7f3a97421f50-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-zztw8\" (UID: \"7bc7eb6e-0aa6-44a5-914e-7f3a97421f50\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-zztw8" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.736242 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7bc7eb6e-0aa6-44a5-914e-7f3a97421f50-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-zztw8\" (UID: \"7bc7eb6e-0aa6-44a5-914e-7f3a97421f50\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-zztw8" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.736298 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpckx\" (UniqueName: \"kubernetes.io/projected/7bc7eb6e-0aa6-44a5-914e-7f3a97421f50-kube-api-access-vpckx\") pod \"nmstate-console-plugin-6b874cbd85-zztw8\" (UID: \"7bc7eb6e-0aa6-44a5-914e-7f3a97421f50\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-zztw8" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.738027 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7bc7eb6e-0aa6-44a5-914e-7f3a97421f50-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-zztw8\" (UID: \"7bc7eb6e-0aa6-44a5-914e-7f3a97421f50\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-zztw8" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.758552 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpckx\" (UniqueName: \"kubernetes.io/projected/7bc7eb6e-0aa6-44a5-914e-7f3a97421f50-kube-api-access-vpckx\") pod \"nmstate-console-plugin-6b874cbd85-zztw8\" (UID: \"7bc7eb6e-0aa6-44a5-914e-7f3a97421f50\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-zztw8" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 
14:12:45.760557 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bc7eb6e-0aa6-44a5-914e-7f3a97421f50-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-zztw8\" (UID: \"7bc7eb6e-0aa6-44a5-914e-7f3a97421f50\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-zztw8" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.795318 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-zztw8" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.807513 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-vbc2m"] Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.837423 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3b00fd86-7851-4ce1-9e47-a96028212a0a-console-oauth-config\") pod \"console-7ddf657454-4zb7x\" (UID: \"3b00fd86-7851-4ce1-9e47-a96028212a0a\") " pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.837479 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3b00fd86-7851-4ce1-9e47-a96028212a0a-service-ca\") pod \"console-7ddf657454-4zb7x\" (UID: \"3b00fd86-7851-4ce1-9e47-a96028212a0a\") " pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.837505 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3b00fd86-7851-4ce1-9e47-a96028212a0a-oauth-serving-cert\") pod \"console-7ddf657454-4zb7x\" (UID: \"3b00fd86-7851-4ce1-9e47-a96028212a0a\") " pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.837535 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vmz9\" (UniqueName: \"kubernetes.io/projected/3b00fd86-7851-4ce1-9e47-a96028212a0a-kube-api-access-6vmz9\") pod \"console-7ddf657454-4zb7x\" (UID: \"3b00fd86-7851-4ce1-9e47-a96028212a0a\") " pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.837574 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b00fd86-7851-4ce1-9e47-a96028212a0a-console-serving-cert\") pod \"console-7ddf657454-4zb7x\" (UID: \"3b00fd86-7851-4ce1-9e47-a96028212a0a\") " pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.837709 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b00fd86-7851-4ce1-9e47-a96028212a0a-trusted-ca-bundle\") pod \"console-7ddf657454-4zb7x\" (UID: \"3b00fd86-7851-4ce1-9e47-a96028212a0a\") " pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.837795 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3b00fd86-7851-4ce1-9e47-a96028212a0a-console-config\") pod \"console-7ddf657454-4zb7x\" (UID: 
\"3b00fd86-7851-4ce1-9e47-a96028212a0a\") " pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.877771 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-mtj6z" Oct 03 14:12:45 crc kubenswrapper[4636]: W1003 14:12:45.926977 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89380ab9_db32_4562_aec2_69a9f3c703b6.slice/crio-1de99d7884330bcb64829cfe3d5aea5c5239e4c8785cff0391a83b8b57835ab6 WatchSource:0}: Error finding container 1de99d7884330bcb64829cfe3d5aea5c5239e4c8785cff0391a83b8b57835ab6: Status 404 returned error can't find the container with id 1de99d7884330bcb64829cfe3d5aea5c5239e4c8785cff0391a83b8b57835ab6 Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.938478 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3b00fd86-7851-4ce1-9e47-a96028212a0a-console-config\") pod \"console-7ddf657454-4zb7x\" (UID: \"3b00fd86-7851-4ce1-9e47-a96028212a0a\") " pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.938538 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3b00fd86-7851-4ce1-9e47-a96028212a0a-console-oauth-config\") pod \"console-7ddf657454-4zb7x\" (UID: \"3b00fd86-7851-4ce1-9e47-a96028212a0a\") " pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.938554 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3b00fd86-7851-4ce1-9e47-a96028212a0a-service-ca\") pod \"console-7ddf657454-4zb7x\" (UID: \"3b00fd86-7851-4ce1-9e47-a96028212a0a\") " pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.938568 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3b00fd86-7851-4ce1-9e47-a96028212a0a-oauth-serving-cert\") pod \"console-7ddf657454-4zb7x\" (UID: \"3b00fd86-7851-4ce1-9e47-a96028212a0a\") " pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.938588 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vmz9\" (UniqueName: \"kubernetes.io/projected/3b00fd86-7851-4ce1-9e47-a96028212a0a-kube-api-access-6vmz9\") pod \"console-7ddf657454-4zb7x\" (UID: \"3b00fd86-7851-4ce1-9e47-a96028212a0a\") " pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.938614 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b00fd86-7851-4ce1-9e47-a96028212a0a-console-serving-cert\") pod \"console-7ddf657454-4zb7x\" (UID: \"3b00fd86-7851-4ce1-9e47-a96028212a0a\") " pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.938640 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/dc21bb9c-24c3-4267-ab8d-96ed8e255c69-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-46ppb\" (UID: \"dc21bb9c-24c3-4267-ab8d-96ed8e255c69\") " 
pod="openshift-nmstate/nmstate-webhook-6cdbc54649-46ppb" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.938665 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b00fd86-7851-4ce1-9e47-a96028212a0a-trusted-ca-bundle\") pod \"console-7ddf657454-4zb7x\" (UID: \"3b00fd86-7851-4ce1-9e47-a96028212a0a\") " pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.939777 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3b00fd86-7851-4ce1-9e47-a96028212a0a-oauth-serving-cert\") pod \"console-7ddf657454-4zb7x\" (UID: \"3b00fd86-7851-4ce1-9e47-a96028212a0a\") " pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.939808 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3b00fd86-7851-4ce1-9e47-a96028212a0a-console-config\") pod \"console-7ddf657454-4zb7x\" (UID: \"3b00fd86-7851-4ce1-9e47-a96028212a0a\") " pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.940185 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b00fd86-7851-4ce1-9e47-a96028212a0a-trusted-ca-bundle\") pod \"console-7ddf657454-4zb7x\" (UID: \"3b00fd86-7851-4ce1-9e47-a96028212a0a\") " pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.940764 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3b00fd86-7851-4ce1-9e47-a96028212a0a-service-ca\") pod \"console-7ddf657454-4zb7x\" (UID: \"3b00fd86-7851-4ce1-9e47-a96028212a0a\") " pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.947027 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b00fd86-7851-4ce1-9e47-a96028212a0a-console-serving-cert\") pod \"console-7ddf657454-4zb7x\" (UID: \"3b00fd86-7851-4ce1-9e47-a96028212a0a\") " pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.949823 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/dc21bb9c-24c3-4267-ab8d-96ed8e255c69-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-46ppb\" (UID: \"dc21bb9c-24c3-4267-ab8d-96ed8e255c69\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-46ppb" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.953302 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3b00fd86-7851-4ce1-9e47-a96028212a0a-console-oauth-config\") pod \"console-7ddf657454-4zb7x\" (UID: \"3b00fd86-7851-4ce1-9e47-a96028212a0a\") " pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.957354 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vmz9\" (UniqueName: \"kubernetes.io/projected/3b00fd86-7851-4ce1-9e47-a96028212a0a-kube-api-access-6vmz9\") pod \"console-7ddf657454-4zb7x\" (UID: \"3b00fd86-7851-4ce1-9e47-a96028212a0a\") " pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 
14:12:45 crc kubenswrapper[4636]: I1003 14:12:45.998647 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-zztw8"] Oct 03 14:12:46 crc kubenswrapper[4636]: I1003 14:12:46.014912 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:46 crc kubenswrapper[4636]: I1003 14:12:46.121186 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-46ppb" Oct 03 14:12:46 crc kubenswrapper[4636]: I1003 14:12:46.225542 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7ddf657454-4zb7x"] Oct 03 14:12:46 crc kubenswrapper[4636]: W1003 14:12:46.232819 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b00fd86_7851_4ce1_9e47_a96028212a0a.slice/crio-ba8954ebd8aa08cad3c695b3741d750c735b8edefffb1c08eaa838f73920f629 WatchSource:0}: Error finding container ba8954ebd8aa08cad3c695b3741d750c735b8edefffb1c08eaa838f73920f629: Status 404 returned error can't find the container with id ba8954ebd8aa08cad3c695b3741d750c735b8edefffb1c08eaa838f73920f629 Oct 03 14:12:46 crc kubenswrapper[4636]: I1003 14:12:46.310417 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vbc2m" event={"ID":"438131cc-c24c-40a2-b874-8d1dca095f61","Type":"ContainerStarted","Data":"caf83f993766859204e09640087177634b5c386f183e445164b865cfed3836ae"} Oct 03 14:12:46 crc kubenswrapper[4636]: I1003 14:12:46.311397 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-zztw8" event={"ID":"7bc7eb6e-0aa6-44a5-914e-7f3a97421f50","Type":"ContainerStarted","Data":"6c06bf28bb65d975992d59d7efe0da61a0994f1951b7b6142d9b8b910ad19090"} Oct 03 14:12:46 crc kubenswrapper[4636]: I1003 14:12:46.312337 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7ddf657454-4zb7x" event={"ID":"3b00fd86-7851-4ce1-9e47-a96028212a0a","Type":"ContainerStarted","Data":"ba8954ebd8aa08cad3c695b3741d750c735b8edefffb1c08eaa838f73920f629"} Oct 03 14:12:46 crc kubenswrapper[4636]: I1003 14:12:46.313183 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mtj6z" event={"ID":"89380ab9-db32-4562-aec2-69a9f3c703b6","Type":"ContainerStarted","Data":"1de99d7884330bcb64829cfe3d5aea5c5239e4c8785cff0391a83b8b57835ab6"} Oct 03 14:12:46 crc kubenswrapper[4636]: I1003 14:12:46.523909 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-46ppb"] Oct 03 14:12:47 crc kubenswrapper[4636]: I1003 14:12:47.320362 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-46ppb" event={"ID":"dc21bb9c-24c3-4267-ab8d-96ed8e255c69","Type":"ContainerStarted","Data":"5712ccfb74301d750c2ebb361a9adfe040e2aac2fd2642cbbbd0e8f37429ed75"} Oct 03 14:12:47 crc kubenswrapper[4636]: I1003 14:12:47.321581 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7ddf657454-4zb7x" event={"ID":"3b00fd86-7851-4ce1-9e47-a96028212a0a","Type":"ContainerStarted","Data":"b0c19b0a691d30da9c02e5c1eedc6755e88c4de502f855fcb473fb5d70b287f3"} Oct 03 14:12:47 crc kubenswrapper[4636]: I1003 14:12:47.342946 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-7ddf657454-4zb7x" podStartSLOduration=2.342927318 podStartE2EDuration="2.342927318s" podCreationTimestamp="2025-10-03 14:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:12:47.341018629 +0000 UTC m=+717.199744896" watchObservedRunningTime="2025-10-03 14:12:47.342927318 +0000 UTC m=+717.201653565" Oct 03 14:12:49 crc kubenswrapper[4636]: I1003 14:12:49.333181 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vbc2m" event={"ID":"438131cc-c24c-40a2-b874-8d1dca095f61","Type":"ContainerStarted","Data":"dc9031802541e6328f10914af106b38f7670843084baf46bc93877b0c699e0e1"} Oct 03 14:12:49 crc kubenswrapper[4636]: I1003 14:12:49.335394 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-zztw8" event={"ID":"7bc7eb6e-0aa6-44a5-914e-7f3a97421f50","Type":"ContainerStarted","Data":"b9c79ce5ca332ec228b505843dd9f4084c978e01126ad2f5111c6fd27408e13d"} Oct 03 14:12:49 crc kubenswrapper[4636]: I1003 14:12:49.337540 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-46ppb" event={"ID":"dc21bb9c-24c3-4267-ab8d-96ed8e255c69","Type":"ContainerStarted","Data":"98acebf4185cf8ee4e4346305831cff2b1023fcf248daf6aeb2d8aa2858f20d3"} Oct 03 14:12:49 crc kubenswrapper[4636]: I1003 14:12:49.337950 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-46ppb" Oct 03 14:12:49 crc kubenswrapper[4636]: I1003 14:12:49.352576 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-zztw8" podStartSLOduration=1.4281607680000001 podStartE2EDuration="4.352548137s" podCreationTimestamp="2025-10-03 14:12:45 +0000 UTC" firstStartedPulling="2025-10-03 14:12:46.006746866 +0000 UTC m=+715.865473113" lastFinishedPulling="2025-10-03 14:12:48.931134235 +0000 UTC m=+718.789860482" observedRunningTime="2025-10-03 14:12:49.348118233 +0000 UTC m=+719.206844480" watchObservedRunningTime="2025-10-03 14:12:49.352548137 +0000 UTC m=+719.211274394" Oct 03 14:12:49 crc kubenswrapper[4636]: I1003 14:12:49.380419 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-46ppb" podStartSLOduration=1.820511175 podStartE2EDuration="4.380399291s" podCreationTimestamp="2025-10-03 14:12:45 +0000 UTC" firstStartedPulling="2025-10-03 14:12:46.533289926 +0000 UTC m=+716.392016173" lastFinishedPulling="2025-10-03 14:12:49.093178042 +0000 UTC m=+718.951904289" observedRunningTime="2025-10-03 14:12:49.377076616 +0000 UTC m=+719.235802883" watchObservedRunningTime="2025-10-03 14:12:49.380399291 +0000 UTC m=+719.239125538" Oct 03 14:12:50 crc kubenswrapper[4636]: I1003 14:12:50.345805 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mtj6z" event={"ID":"89380ab9-db32-4562-aec2-69a9f3c703b6","Type":"ContainerStarted","Data":"6d52403e9370338e5c4c8625737acbf4aed3415b72ea09ffaf356afb790ec42a"} Oct 03 14:12:50 crc kubenswrapper[4636]: I1003 14:12:50.362546 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-mtj6z" podStartSLOduration=2.263522561 podStartE2EDuration="5.362496379s" podCreationTimestamp="2025-10-03 14:12:45 +0000 UTC" firstStartedPulling="2025-10-03 
14:12:45.929235988 +0000 UTC m=+715.787962235" lastFinishedPulling="2025-10-03 14:12:49.028209806 +0000 UTC m=+718.886936053" observedRunningTime="2025-10-03 14:12:50.359181724 +0000 UTC m=+720.217907971" watchObservedRunningTime="2025-10-03 14:12:50.362496379 +0000 UTC m=+720.221222626" Oct 03 14:12:50 crc kubenswrapper[4636]: I1003 14:12:50.878188 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-mtj6z" Oct 03 14:12:52 crc kubenswrapper[4636]: I1003 14:12:52.362862 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vbc2m" event={"ID":"438131cc-c24c-40a2-b874-8d1dca095f61","Type":"ContainerStarted","Data":"2e8e44f5f3a80973180b1237cfb6f3e6739e27a13022c4db5dfab1497aa8e29b"} Oct 03 14:12:52 crc kubenswrapper[4636]: I1003 14:12:52.395723 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vbc2m" podStartSLOduration=1.967495076 podStartE2EDuration="7.395698113s" podCreationTimestamp="2025-10-03 14:12:45 +0000 UTC" firstStartedPulling="2025-10-03 14:12:45.822679024 +0000 UTC m=+715.681405271" lastFinishedPulling="2025-10-03 14:12:51.250882061 +0000 UTC m=+721.109608308" observedRunningTime="2025-10-03 14:12:52.37882955 +0000 UTC m=+722.237555797" watchObservedRunningTime="2025-10-03 14:12:52.395698113 +0000 UTC m=+722.254424380" Oct 03 14:12:55 crc kubenswrapper[4636]: I1003 14:12:55.901766 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-mtj6z" Oct 03 14:12:56 crc kubenswrapper[4636]: I1003 14:12:56.015736 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:56 crc kubenswrapper[4636]: I1003 14:12:56.016034 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:56 crc kubenswrapper[4636]: I1003 14:12:56.020122 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:56 crc kubenswrapper[4636]: I1003 14:12:56.387654 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7ddf657454-4zb7x" Oct 03 14:12:56 crc kubenswrapper[4636]: I1003 14:12:56.456433 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-lqxss"] Oct 03 14:13:06 crc kubenswrapper[4636]: I1003 14:13:06.127755 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-46ppb" Oct 03 14:13:09 crc kubenswrapper[4636]: I1003 14:13:09.162732 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:13:09 crc kubenswrapper[4636]: I1003 14:13:09.164116 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:13:17 crc kubenswrapper[4636]: I1003 14:13:17.876117 4636 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj"] Oct 03 14:13:17 crc kubenswrapper[4636]: I1003 14:13:17.877896 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj" Oct 03 14:13:17 crc kubenswrapper[4636]: I1003 14:13:17.879967 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 14:13:17 crc kubenswrapper[4636]: I1003 14:13:17.904355 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj"] Oct 03 14:13:18 crc kubenswrapper[4636]: I1003 14:13:18.052798 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdbgd\" (UniqueName: \"kubernetes.io/projected/a83000c5-7baa-4587-980d-90391869a32c-kube-api-access-pdbgd\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj\" (UID: \"a83000c5-7baa-4587-980d-90391869a32c\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj" Oct 03 14:13:18 crc kubenswrapper[4636]: I1003 14:13:18.052861 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a83000c5-7baa-4587-980d-90391869a32c-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj\" (UID: \"a83000c5-7baa-4587-980d-90391869a32c\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj" Oct 03 14:13:18 crc kubenswrapper[4636]: I1003 14:13:18.052893 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a83000c5-7baa-4587-980d-90391869a32c-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj\" (UID: \"a83000c5-7baa-4587-980d-90391869a32c\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj" Oct 03 14:13:18 crc kubenswrapper[4636]: I1003 14:13:18.154253 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdbgd\" (UniqueName: \"kubernetes.io/projected/a83000c5-7baa-4587-980d-90391869a32c-kube-api-access-pdbgd\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj\" (UID: \"a83000c5-7baa-4587-980d-90391869a32c\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj" Oct 03 14:13:18 crc kubenswrapper[4636]: I1003 14:13:18.154317 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a83000c5-7baa-4587-980d-90391869a32c-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj\" (UID: \"a83000c5-7baa-4587-980d-90391869a32c\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj" Oct 03 14:13:18 crc kubenswrapper[4636]: I1003 14:13:18.154344 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a83000c5-7baa-4587-980d-90391869a32c-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj\" (UID: \"a83000c5-7baa-4587-980d-90391869a32c\") " 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj" Oct 03 14:13:18 crc kubenswrapper[4636]: I1003 14:13:18.154809 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a83000c5-7baa-4587-980d-90391869a32c-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj\" (UID: \"a83000c5-7baa-4587-980d-90391869a32c\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj" Oct 03 14:13:18 crc kubenswrapper[4636]: I1003 14:13:18.154850 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a83000c5-7baa-4587-980d-90391869a32c-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj\" (UID: \"a83000c5-7baa-4587-980d-90391869a32c\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj" Oct 03 14:13:18 crc kubenswrapper[4636]: I1003 14:13:18.172979 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdbgd\" (UniqueName: \"kubernetes.io/projected/a83000c5-7baa-4587-980d-90391869a32c-kube-api-access-pdbgd\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj\" (UID: \"a83000c5-7baa-4587-980d-90391869a32c\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj" Oct 03 14:13:18 crc kubenswrapper[4636]: I1003 14:13:18.196847 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj" Oct 03 14:13:18 crc kubenswrapper[4636]: I1003 14:13:18.387534 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj"] Oct 03 14:13:18 crc kubenswrapper[4636]: I1003 14:13:18.495208 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj" event={"ID":"a83000c5-7baa-4587-980d-90391869a32c","Type":"ContainerStarted","Data":"66a3197290d51d1bfb50dc8c852a83355f65a32f76ca9cf405c55a31eaaaeee4"} Oct 03 14:13:18 crc kubenswrapper[4636]: E1003 14:13:18.700236 4636 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda83000c5_7baa_4587_980d_90391869a32c.slice/crio-3d8d1cef27373561267034b7eff0e7367d27ddaddf5d1aec2e2f86ec786abd0c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda83000c5_7baa_4587_980d_90391869a32c.slice/crio-conmon-3d8d1cef27373561267034b7eff0e7367d27ddaddf5d1aec2e2f86ec786abd0c.scope\": RecentStats: unable to find data in memory cache]" Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.158784 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mz2wd"] Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.159305 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" podUID="6b8981cc-75fe-4ecd-971f-f01c74e8fd74" containerName="controller-manager" containerID="cri-o://3590e607bb939f0376f5f7907e81903fece6dc0ceaa6146bebd6e56508b19c14" gracePeriod=30 Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 
14:13:19.255883 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4"] Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.256130 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" podUID="8c513f61-cee7-451f-b8a9-1dab425641a8" containerName="route-controller-manager" containerID="cri-o://7191bd6becaabd21136e7d6a55aa2e68dc1b4cdf91a4e59ae500408128f0522d" gracePeriod=30 Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.505172 4636 generic.go:334] "Generic (PLEG): container finished" podID="a83000c5-7baa-4587-980d-90391869a32c" containerID="3d8d1cef27373561267034b7eff0e7367d27ddaddf5d1aec2e2f86ec786abd0c" exitCode=0 Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.505231 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj" event={"ID":"a83000c5-7baa-4587-980d-90391869a32c","Type":"ContainerDied","Data":"3d8d1cef27373561267034b7eff0e7367d27ddaddf5d1aec2e2f86ec786abd0c"} Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.510054 4636 generic.go:334] "Generic (PLEG): container finished" podID="6b8981cc-75fe-4ecd-971f-f01c74e8fd74" containerID="3590e607bb939f0376f5f7907e81903fece6dc0ceaa6146bebd6e56508b19c14" exitCode=0 Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.510127 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" event={"ID":"6b8981cc-75fe-4ecd-971f-f01c74e8fd74","Type":"ContainerDied","Data":"3590e607bb939f0376f5f7907e81903fece6dc0ceaa6146bebd6e56508b19c14"} Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.510154 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" event={"ID":"6b8981cc-75fe-4ecd-971f-f01c74e8fd74","Type":"ContainerDied","Data":"5b131f35ee3718c726e206599a0cde558e947f1a8eeafea21f4e6bc75913ccbd"} Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.510166 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b131f35ee3718c726e206599a0cde558e947f1a8eeafea21f4e6bc75913ccbd" Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.517499 4636 generic.go:334] "Generic (PLEG): container finished" podID="8c513f61-cee7-451f-b8a9-1dab425641a8" containerID="7191bd6becaabd21136e7d6a55aa2e68dc1b4cdf91a4e59ae500408128f0522d" exitCode=0 Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.517556 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" event={"ID":"8c513f61-cee7-451f-b8a9-1dab425641a8","Type":"ContainerDied","Data":"7191bd6becaabd21136e7d6a55aa2e68dc1b4cdf91a4e59ae500408128f0522d"} Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.548557 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.583424 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-config\") pod \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\" (UID: \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\") " Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.583515 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b5dg\" (UniqueName: \"kubernetes.io/projected/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-kube-api-access-8b5dg\") pod \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\" (UID: \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\") " Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.583565 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-client-ca\") pod \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\" (UID: \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\") " Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.583585 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-proxy-ca-bundles\") pod \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\" (UID: \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\") " Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.583631 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-serving-cert\") pod \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\" (UID: \"6b8981cc-75fe-4ecd-971f-f01c74e8fd74\") " Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.584311 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-client-ca" (OuterVolumeSpecName: "client-ca") pod "6b8981cc-75fe-4ecd-971f-f01c74e8fd74" (UID: "6b8981cc-75fe-4ecd-971f-f01c74e8fd74"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.584383 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-config" (OuterVolumeSpecName: "config") pod "6b8981cc-75fe-4ecd-971f-f01c74e8fd74" (UID: "6b8981cc-75fe-4ecd-971f-f01c74e8fd74"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.584641 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6b8981cc-75fe-4ecd-971f-f01c74e8fd74" (UID: "6b8981cc-75fe-4ecd-971f-f01c74e8fd74"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.589395 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-kube-api-access-8b5dg" (OuterVolumeSpecName: "kube-api-access-8b5dg") pod "6b8981cc-75fe-4ecd-971f-f01c74e8fd74" (UID: "6b8981cc-75fe-4ecd-971f-f01c74e8fd74"). InnerVolumeSpecName "kube-api-access-8b5dg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.590017 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6b8981cc-75fe-4ecd-971f-f01c74e8fd74" (UID: "6b8981cc-75fe-4ecd-971f-f01c74e8fd74"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.612151 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.684612 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c513f61-cee7-451f-b8a9-1dab425641a8-config\") pod \"8c513f61-cee7-451f-b8a9-1dab425641a8\" (UID: \"8c513f61-cee7-451f-b8a9-1dab425641a8\") " Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.685074 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfzzt\" (UniqueName: \"kubernetes.io/projected/8c513f61-cee7-451f-b8a9-1dab425641a8-kube-api-access-mfzzt\") pod \"8c513f61-cee7-451f-b8a9-1dab425641a8\" (UID: \"8c513f61-cee7-451f-b8a9-1dab425641a8\") " Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.685136 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c513f61-cee7-451f-b8a9-1dab425641a8-client-ca\") pod \"8c513f61-cee7-451f-b8a9-1dab425641a8\" (UID: \"8c513f61-cee7-451f-b8a9-1dab425641a8\") " Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.685159 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c513f61-cee7-451f-b8a9-1dab425641a8-serving-cert\") pod \"8c513f61-cee7-451f-b8a9-1dab425641a8\" (UID: \"8c513f61-cee7-451f-b8a9-1dab425641a8\") " Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.685370 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.685390 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b5dg\" (UniqueName: \"kubernetes.io/projected/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-kube-api-access-8b5dg\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.685402 4636 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.685412 4636 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.685423 4636 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b8981cc-75fe-4ecd-971f-f01c74e8fd74-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.685866 4636 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/8c513f61-cee7-451f-b8a9-1dab425641a8-config" (OuterVolumeSpecName: "config") pod "8c513f61-cee7-451f-b8a9-1dab425641a8" (UID: "8c513f61-cee7-451f-b8a9-1dab425641a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.685880 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c513f61-cee7-451f-b8a9-1dab425641a8-client-ca" (OuterVolumeSpecName: "client-ca") pod "8c513f61-cee7-451f-b8a9-1dab425641a8" (UID: "8c513f61-cee7-451f-b8a9-1dab425641a8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.688536 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c513f61-cee7-451f-b8a9-1dab425641a8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8c513f61-cee7-451f-b8a9-1dab425641a8" (UID: "8c513f61-cee7-451f-b8a9-1dab425641a8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.688864 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c513f61-cee7-451f-b8a9-1dab425641a8-kube-api-access-mfzzt" (OuterVolumeSpecName: "kube-api-access-mfzzt") pod "8c513f61-cee7-451f-b8a9-1dab425641a8" (UID: "8c513f61-cee7-451f-b8a9-1dab425641a8"). InnerVolumeSpecName "kube-api-access-mfzzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.786902 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfzzt\" (UniqueName: \"kubernetes.io/projected/8c513f61-cee7-451f-b8a9-1dab425641a8-kube-api-access-mfzzt\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.786936 4636 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c513f61-cee7-451f-b8a9-1dab425641a8-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.786945 4636 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c513f61-cee7-451f-b8a9-1dab425641a8-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:19 crc kubenswrapper[4636]: I1003 14:13:19.786956 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c513f61-cee7-451f-b8a9-1dab425641a8-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.524682 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mz2wd" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.529336 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.529692 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4" event={"ID":"8c513f61-cee7-451f-b8a9-1dab425641a8","Type":"ContainerDied","Data":"415a0d2a388e77176ff971efc40a15cfc0cd2cca6420979fb85425d2d981e630"} Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.529755 4636 scope.go:117] "RemoveContainer" containerID="7191bd6becaabd21136e7d6a55aa2e68dc1b4cdf91a4e59ae500408128f0522d" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.570238 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb94845d5-qv7dv"] Oct 03 14:13:20 crc kubenswrapper[4636]: E1003 14:13:20.570475 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c513f61-cee7-451f-b8a9-1dab425641a8" containerName="route-controller-manager" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.570490 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c513f61-cee7-451f-b8a9-1dab425641a8" containerName="route-controller-manager" Oct 03 14:13:20 crc kubenswrapper[4636]: E1003 14:13:20.570505 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b8981cc-75fe-4ecd-971f-f01c74e8fd74" containerName="controller-manager" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.570512 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8981cc-75fe-4ecd-971f-f01c74e8fd74" containerName="controller-manager" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.570618 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b8981cc-75fe-4ecd-971f-f01c74e8fd74" containerName="controller-manager" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.570627 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c513f61-cee7-451f-b8a9-1dab425641a8" containerName="route-controller-manager" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.570990 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cb94845d5-qv7dv" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.572510 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.572714 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.572851 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.572808 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.573092 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.574472 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.585375 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4"] Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.595426 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb94845d5-qv7dv"] Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.597092 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ef24188-7dab-43a6-8fbc-22e183d1f3f1-client-ca\") pod \"route-controller-manager-7cb94845d5-qv7dv\" (UID: \"2ef24188-7dab-43a6-8fbc-22e183d1f3f1\") " pod="openshift-route-controller-manager/route-controller-manager-7cb94845d5-qv7dv" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.597167 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ef24188-7dab-43a6-8fbc-22e183d1f3f1-serving-cert\") pod \"route-controller-manager-7cb94845d5-qv7dv\" (UID: \"2ef24188-7dab-43a6-8fbc-22e183d1f3f1\") " pod="openshift-route-controller-manager/route-controller-manager-7cb94845d5-qv7dv" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.597191 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tbtl\" (UniqueName: \"kubernetes.io/projected/2ef24188-7dab-43a6-8fbc-22e183d1f3f1-kube-api-access-2tbtl\") pod \"route-controller-manager-7cb94845d5-qv7dv\" (UID: \"2ef24188-7dab-43a6-8fbc-22e183d1f3f1\") " pod="openshift-route-controller-manager/route-controller-manager-7cb94845d5-qv7dv" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.597269 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ef24188-7dab-43a6-8fbc-22e183d1f3f1-config\") pod \"route-controller-manager-7cb94845d5-qv7dv\" (UID: \"2ef24188-7dab-43a6-8fbc-22e183d1f3f1\") " pod="openshift-route-controller-manager/route-controller-manager-7cb94845d5-qv7dv" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.602984 4636 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pc4j4"] Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.608435 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mz2wd"] Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.612923 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mz2wd"] Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.698902 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ef24188-7dab-43a6-8fbc-22e183d1f3f1-serving-cert\") pod \"route-controller-manager-7cb94845d5-qv7dv\" (UID: \"2ef24188-7dab-43a6-8fbc-22e183d1f3f1\") " pod="openshift-route-controller-manager/route-controller-manager-7cb94845d5-qv7dv" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.698948 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tbtl\" (UniqueName: \"kubernetes.io/projected/2ef24188-7dab-43a6-8fbc-22e183d1f3f1-kube-api-access-2tbtl\") pod \"route-controller-manager-7cb94845d5-qv7dv\" (UID: \"2ef24188-7dab-43a6-8fbc-22e183d1f3f1\") " pod="openshift-route-controller-manager/route-controller-manager-7cb94845d5-qv7dv" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.699029 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ef24188-7dab-43a6-8fbc-22e183d1f3f1-config\") pod \"route-controller-manager-7cb94845d5-qv7dv\" (UID: \"2ef24188-7dab-43a6-8fbc-22e183d1f3f1\") " pod="openshift-route-controller-manager/route-controller-manager-7cb94845d5-qv7dv" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.699176 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ef24188-7dab-43a6-8fbc-22e183d1f3f1-client-ca\") pod \"route-controller-manager-7cb94845d5-qv7dv\" (UID: \"2ef24188-7dab-43a6-8fbc-22e183d1f3f1\") " pod="openshift-route-controller-manager/route-controller-manager-7cb94845d5-qv7dv" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.699993 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ef24188-7dab-43a6-8fbc-22e183d1f3f1-client-ca\") pod \"route-controller-manager-7cb94845d5-qv7dv\" (UID: \"2ef24188-7dab-43a6-8fbc-22e183d1f3f1\") " pod="openshift-route-controller-manager/route-controller-manager-7cb94845d5-qv7dv" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.700796 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ef24188-7dab-43a6-8fbc-22e183d1f3f1-config\") pod \"route-controller-manager-7cb94845d5-qv7dv\" (UID: \"2ef24188-7dab-43a6-8fbc-22e183d1f3f1\") " pod="openshift-route-controller-manager/route-controller-manager-7cb94845d5-qv7dv" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.710830 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ef24188-7dab-43a6-8fbc-22e183d1f3f1-serving-cert\") pod \"route-controller-manager-7cb94845d5-qv7dv\" (UID: \"2ef24188-7dab-43a6-8fbc-22e183d1f3f1\") " pod="openshift-route-controller-manager/route-controller-manager-7cb94845d5-qv7dv" Oct 03 14:13:20 crc 
kubenswrapper[4636]: I1003 14:13:20.714729 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tbtl\" (UniqueName: \"kubernetes.io/projected/2ef24188-7dab-43a6-8fbc-22e183d1f3f1-kube-api-access-2tbtl\") pod \"route-controller-manager-7cb94845d5-qv7dv\" (UID: \"2ef24188-7dab-43a6-8fbc-22e183d1f3f1\") " pod="openshift-route-controller-manager/route-controller-manager-7cb94845d5-qv7dv" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.802842 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b8981cc-75fe-4ecd-971f-f01c74e8fd74" path="/var/lib/kubelet/pods/6b8981cc-75fe-4ecd-971f-f01c74e8fd74/volumes" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.803364 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c513f61-cee7-451f-b8a9-1dab425641a8" path="/var/lib/kubelet/pods/8c513f61-cee7-451f-b8a9-1dab425641a8/volumes" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.830223 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5c78cc9769-7d74g"] Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.831172 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c78cc9769-7d74g" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.833356 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.834796 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.834969 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.835191 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.836865 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.838883 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.840278 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c78cc9769-7d74g"] Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.843428 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.889526 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cb94845d5-qv7dv" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.901207 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df11b45c-05dd-4232-a9f1-5a3236816bd1-config\") pod \"controller-manager-5c78cc9769-7d74g\" (UID: \"df11b45c-05dd-4232-a9f1-5a3236816bd1\") " pod="openshift-controller-manager/controller-manager-5c78cc9769-7d74g" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.901301 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df11b45c-05dd-4232-a9f1-5a3236816bd1-proxy-ca-bundles\") pod \"controller-manager-5c78cc9769-7d74g\" (UID: \"df11b45c-05dd-4232-a9f1-5a3236816bd1\") " pod="openshift-controller-manager/controller-manager-5c78cc9769-7d74g" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.901543 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df11b45c-05dd-4232-a9f1-5a3236816bd1-serving-cert\") pod \"controller-manager-5c78cc9769-7d74g\" (UID: \"df11b45c-05dd-4232-a9f1-5a3236816bd1\") " pod="openshift-controller-manager/controller-manager-5c78cc9769-7d74g" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.901574 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jblf5\" (UniqueName: \"kubernetes.io/projected/df11b45c-05dd-4232-a9f1-5a3236816bd1-kube-api-access-jblf5\") pod \"controller-manager-5c78cc9769-7d74g\" (UID: \"df11b45c-05dd-4232-a9f1-5a3236816bd1\") " pod="openshift-controller-manager/controller-manager-5c78cc9769-7d74g" Oct 03 14:13:20 crc kubenswrapper[4636]: I1003 14:13:20.901605 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df11b45c-05dd-4232-a9f1-5a3236816bd1-client-ca\") pod \"controller-manager-5c78cc9769-7d74g\" (UID: \"df11b45c-05dd-4232-a9f1-5a3236816bd1\") " pod="openshift-controller-manager/controller-manager-5c78cc9769-7d74g" Oct 03 14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.002795 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df11b45c-05dd-4232-a9f1-5a3236816bd1-serving-cert\") pod \"controller-manager-5c78cc9769-7d74g\" (UID: \"df11b45c-05dd-4232-a9f1-5a3236816bd1\") " pod="openshift-controller-manager/controller-manager-5c78cc9769-7d74g" Oct 03 14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.002848 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jblf5\" (UniqueName: \"kubernetes.io/projected/df11b45c-05dd-4232-a9f1-5a3236816bd1-kube-api-access-jblf5\") pod \"controller-manager-5c78cc9769-7d74g\" (UID: \"df11b45c-05dd-4232-a9f1-5a3236816bd1\") " pod="openshift-controller-manager/controller-manager-5c78cc9769-7d74g" Oct 03 14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.002882 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df11b45c-05dd-4232-a9f1-5a3236816bd1-client-ca\") pod \"controller-manager-5c78cc9769-7d74g\" (UID: \"df11b45c-05dd-4232-a9f1-5a3236816bd1\") " 
pod="openshift-controller-manager/controller-manager-5c78cc9769-7d74g" Oct 03 14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.002926 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df11b45c-05dd-4232-a9f1-5a3236816bd1-config\") pod \"controller-manager-5c78cc9769-7d74g\" (UID: \"df11b45c-05dd-4232-a9f1-5a3236816bd1\") " pod="openshift-controller-manager/controller-manager-5c78cc9769-7d74g" Oct 03 14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.002959 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df11b45c-05dd-4232-a9f1-5a3236816bd1-proxy-ca-bundles\") pod \"controller-manager-5c78cc9769-7d74g\" (UID: \"df11b45c-05dd-4232-a9f1-5a3236816bd1\") " pod="openshift-controller-manager/controller-manager-5c78cc9769-7d74g" Oct 03 14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.004392 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df11b45c-05dd-4232-a9f1-5a3236816bd1-client-ca\") pod \"controller-manager-5c78cc9769-7d74g\" (UID: \"df11b45c-05dd-4232-a9f1-5a3236816bd1\") " pod="openshift-controller-manager/controller-manager-5c78cc9769-7d74g" Oct 03 14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.006076 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df11b45c-05dd-4232-a9f1-5a3236816bd1-config\") pod \"controller-manager-5c78cc9769-7d74g\" (UID: \"df11b45c-05dd-4232-a9f1-5a3236816bd1\") " pod="openshift-controller-manager/controller-manager-5c78cc9769-7d74g" Oct 03 14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.006159 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df11b45c-05dd-4232-a9f1-5a3236816bd1-proxy-ca-bundles\") pod \"controller-manager-5c78cc9769-7d74g\" (UID: \"df11b45c-05dd-4232-a9f1-5a3236816bd1\") " pod="openshift-controller-manager/controller-manager-5c78cc9769-7d74g" Oct 03 14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.010173 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df11b45c-05dd-4232-a9f1-5a3236816bd1-serving-cert\") pod \"controller-manager-5c78cc9769-7d74g\" (UID: \"df11b45c-05dd-4232-a9f1-5a3236816bd1\") " pod="openshift-controller-manager/controller-manager-5c78cc9769-7d74g" Oct 03 14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.025006 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jblf5\" (UniqueName: \"kubernetes.io/projected/df11b45c-05dd-4232-a9f1-5a3236816bd1-kube-api-access-jblf5\") pod \"controller-manager-5c78cc9769-7d74g\" (UID: \"df11b45c-05dd-4232-a9f1-5a3236816bd1\") " pod="openshift-controller-manager/controller-manager-5c78cc9769-7d74g" Oct 03 14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.095275 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb94845d5-qv7dv"] Oct 03 14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.170806 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5c78cc9769-7d74g" Oct 03 14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.513652 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-lqxss" podUID="f6977d44-d8ff-4d40-959f-024da50c53fe" containerName="console" containerID="cri-o://f6ec9cd648c475fd8dc49dbf775e20316cb6a87cadad0f41a4bf942ab0c24746" gracePeriod=15 Oct 03 14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.532211 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cb94845d5-qv7dv" event={"ID":"2ef24188-7dab-43a6-8fbc-22e183d1f3f1","Type":"ContainerStarted","Data":"84225305d534ae1567d8380b874e541232a8d95dce3b078e9886bd80e0c19742"} Oct 03 14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.532261 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cb94845d5-qv7dv" event={"ID":"2ef24188-7dab-43a6-8fbc-22e183d1f3f1","Type":"ContainerStarted","Data":"5a3fd3aca5d6f5f80bc3fead7d1cc858931968c631df9df27df2bb61f5a844c6"} Oct 03 14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.532277 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7cb94845d5-qv7dv" Oct 03 14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.536809 4636 generic.go:334] "Generic (PLEG): container finished" podID="a83000c5-7baa-4587-980d-90391869a32c" containerID="dfc5f3e61b027743feb99f9d72ad65305c47223db0705e21bc3f281ddec6515e" exitCode=0 Oct 03 14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.536845 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj" event={"ID":"a83000c5-7baa-4587-980d-90391869a32c","Type":"ContainerDied","Data":"dfc5f3e61b027743feb99f9d72ad65305c47223db0705e21bc3f281ddec6515e"} Oct 03 14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.551012 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7cb94845d5-qv7dv" podStartSLOduration=1.550991138 podStartE2EDuration="1.550991138s" podCreationTimestamp="2025-10-03 14:13:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:13:21.548198577 +0000 UTC m=+751.406924834" watchObservedRunningTime="2025-10-03 14:13:21.550991138 +0000 UTC m=+751.409717385" Oct 03 14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.573693 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c78cc9769-7d74g"] Oct 03 14:13:21 crc kubenswrapper[4636]: W1003 14:13:21.588537 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf11b45c_05dd_4232_a9f1_5a3236816bd1.slice/crio-eea9baa19678953f547a12d1bf31f2a30be92f16b567809e2d0106d65f33225f WatchSource:0}: Error finding container eea9baa19678953f547a12d1bf31f2a30be92f16b567809e2d0106d65f33225f: Status 404 returned error can't find the container with id eea9baa19678953f547a12d1bf31f2a30be92f16b567809e2d0106d65f33225f Oct 03 14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.674774 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7cb94845d5-qv7dv" Oct 03 
14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.944559 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-lqxss_f6977d44-d8ff-4d40-959f-024da50c53fe/console/0.log" Oct 03 14:13:21 crc kubenswrapper[4636]: I1003 14:13:21.945014 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.016788 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-trusted-ca-bundle\") pod \"f6977d44-d8ff-4d40-959f-024da50c53fe\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.016850 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6977d44-d8ff-4d40-959f-024da50c53fe-console-oauth-config\") pod \"f6977d44-d8ff-4d40-959f-024da50c53fe\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.016880 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-oauth-serving-cert\") pod \"f6977d44-d8ff-4d40-959f-024da50c53fe\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.016907 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwmjm\" (UniqueName: \"kubernetes.io/projected/f6977d44-d8ff-4d40-959f-024da50c53fe-kube-api-access-mwmjm\") pod \"f6977d44-d8ff-4d40-959f-024da50c53fe\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.016949 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-console-config\") pod \"f6977d44-d8ff-4d40-959f-024da50c53fe\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.016970 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6977d44-d8ff-4d40-959f-024da50c53fe-console-serving-cert\") pod \"f6977d44-d8ff-4d40-959f-024da50c53fe\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.017014 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-service-ca\") pod \"f6977d44-d8ff-4d40-959f-024da50c53fe\" (UID: \"f6977d44-d8ff-4d40-959f-024da50c53fe\") " Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.017916 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-service-ca" (OuterVolumeSpecName: "service-ca") pod "f6977d44-d8ff-4d40-959f-024da50c53fe" (UID: "f6977d44-d8ff-4d40-959f-024da50c53fe"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.018230 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f6977d44-d8ff-4d40-959f-024da50c53fe" (UID: "f6977d44-d8ff-4d40-959f-024da50c53fe"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.034024 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-console-config" (OuterVolumeSpecName: "console-config") pod "f6977d44-d8ff-4d40-959f-024da50c53fe" (UID: "f6977d44-d8ff-4d40-959f-024da50c53fe"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.034190 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6977d44-d8ff-4d40-959f-024da50c53fe-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f6977d44-d8ff-4d40-959f-024da50c53fe" (UID: "f6977d44-d8ff-4d40-959f-024da50c53fe"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.034268 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f6977d44-d8ff-4d40-959f-024da50c53fe" (UID: "f6977d44-d8ff-4d40-959f-024da50c53fe"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.037539 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6977d44-d8ff-4d40-959f-024da50c53fe-kube-api-access-mwmjm" (OuterVolumeSpecName: "kube-api-access-mwmjm") pod "f6977d44-d8ff-4d40-959f-024da50c53fe" (UID: "f6977d44-d8ff-4d40-959f-024da50c53fe"). InnerVolumeSpecName "kube-api-access-mwmjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.039682 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6977d44-d8ff-4d40-959f-024da50c53fe-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f6977d44-d8ff-4d40-959f-024da50c53fe" (UID: "f6977d44-d8ff-4d40-959f-024da50c53fe"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.118839 4636 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.118875 4636 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.118889 4636 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6977d44-d8ff-4d40-959f-024da50c53fe-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.118901 4636 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.118912 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwmjm\" (UniqueName: \"kubernetes.io/projected/f6977d44-d8ff-4d40-959f-024da50c53fe-kube-api-access-mwmjm\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.118923 4636 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6977d44-d8ff-4d40-959f-024da50c53fe-console-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.118934 4636 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6977d44-d8ff-4d40-959f-024da50c53fe-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.542746 4636 generic.go:334] "Generic (PLEG): container finished" podID="a83000c5-7baa-4587-980d-90391869a32c" containerID="2111f2b90b2bdd6a1b590893239fcf3c8debe3b8530d82820d3ee64b4c9d6ada" exitCode=0 Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.542816 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj" event={"ID":"a83000c5-7baa-4587-980d-90391869a32c","Type":"ContainerDied","Data":"2111f2b90b2bdd6a1b590893239fcf3c8debe3b8530d82820d3ee64b4c9d6ada"} Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.544479 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c78cc9769-7d74g" event={"ID":"df11b45c-05dd-4232-a9f1-5a3236816bd1","Type":"ContainerStarted","Data":"d2d8ff53ef2f8ac2e7cb0951a8425e4550a6a14ef646ad82ab5fb5b032587991"} Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.544546 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c78cc9769-7d74g" event={"ID":"df11b45c-05dd-4232-a9f1-5a3236816bd1","Type":"ContainerStarted","Data":"eea9baa19678953f547a12d1bf31f2a30be92f16b567809e2d0106d65f33225f"} Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.546036 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-lqxss_f6977d44-d8ff-4d40-959f-024da50c53fe/console/0.log" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.546208 4636 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5c78cc9769-7d74g" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.546223 4636 generic.go:334] "Generic (PLEG): container finished" podID="f6977d44-d8ff-4d40-959f-024da50c53fe" containerID="f6ec9cd648c475fd8dc49dbf775e20316cb6a87cadad0f41a4bf942ab0c24746" exitCode=2 Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.546229 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lqxss" event={"ID":"f6977d44-d8ff-4d40-959f-024da50c53fe","Type":"ContainerDied","Data":"f6ec9cd648c475fd8dc49dbf775e20316cb6a87cadad0f41a4bf942ab0c24746"} Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.546280 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lqxss" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.546329 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lqxss" event={"ID":"f6977d44-d8ff-4d40-959f-024da50c53fe","Type":"ContainerDied","Data":"b2b9378e0b9e22502db48d84622387f275e08bfe44fb26b32c2339a2196d4115"} Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.546363 4636 scope.go:117] "RemoveContainer" containerID="f6ec9cd648c475fd8dc49dbf775e20316cb6a87cadad0f41a4bf942ab0c24746" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.551351 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5c78cc9769-7d74g" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.566885 4636 scope.go:117] "RemoveContainer" containerID="f6ec9cd648c475fd8dc49dbf775e20316cb6a87cadad0f41a4bf942ab0c24746" Oct 03 14:13:22 crc kubenswrapper[4636]: E1003 14:13:22.567691 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6ec9cd648c475fd8dc49dbf775e20316cb6a87cadad0f41a4bf942ab0c24746\": container with ID starting with f6ec9cd648c475fd8dc49dbf775e20316cb6a87cadad0f41a4bf942ab0c24746 not found: ID does not exist" containerID="f6ec9cd648c475fd8dc49dbf775e20316cb6a87cadad0f41a4bf942ab0c24746" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.567718 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6ec9cd648c475fd8dc49dbf775e20316cb6a87cadad0f41a4bf942ab0c24746"} err="failed to get container status \"f6ec9cd648c475fd8dc49dbf775e20316cb6a87cadad0f41a4bf942ab0c24746\": rpc error: code = NotFound desc = could not find container \"f6ec9cd648c475fd8dc49dbf775e20316cb6a87cadad0f41a4bf942ab0c24746\": container with ID starting with f6ec9cd648c475fd8dc49dbf775e20316cb6a87cadad0f41a4bf942ab0c24746 not found: ID does not exist" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.595867 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5c78cc9769-7d74g" podStartSLOduration=3.595847966 podStartE2EDuration="3.595847966s" podCreationTimestamp="2025-10-03 14:13:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:13:22.593539366 +0000 UTC m=+752.452265613" watchObservedRunningTime="2025-10-03 14:13:22.595847966 +0000 UTC m=+752.454574213" Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.614997 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-f9d7485db-lqxss"] Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.621475 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-lqxss"] Oct 03 14:13:22 crc kubenswrapper[4636]: I1003 14:13:22.810502 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6977d44-d8ff-4d40-959f-024da50c53fe" path="/var/lib/kubelet/pods/f6977d44-d8ff-4d40-959f-024da50c53fe/volumes" Oct 03 14:13:23 crc kubenswrapper[4636]: I1003 14:13:23.785266 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj" Oct 03 14:13:23 crc kubenswrapper[4636]: I1003 14:13:23.838060 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a83000c5-7baa-4587-980d-90391869a32c-util\") pod \"a83000c5-7baa-4587-980d-90391869a32c\" (UID: \"a83000c5-7baa-4587-980d-90391869a32c\") " Oct 03 14:13:23 crc kubenswrapper[4636]: I1003 14:13:23.838204 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a83000c5-7baa-4587-980d-90391869a32c-bundle\") pod \"a83000c5-7baa-4587-980d-90391869a32c\" (UID: \"a83000c5-7baa-4587-980d-90391869a32c\") " Oct 03 14:13:23 crc kubenswrapper[4636]: I1003 14:13:23.838229 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdbgd\" (UniqueName: \"kubernetes.io/projected/a83000c5-7baa-4587-980d-90391869a32c-kube-api-access-pdbgd\") pod \"a83000c5-7baa-4587-980d-90391869a32c\" (UID: \"a83000c5-7baa-4587-980d-90391869a32c\") " Oct 03 14:13:23 crc kubenswrapper[4636]: I1003 14:13:23.839142 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a83000c5-7baa-4587-980d-90391869a32c-bundle" (OuterVolumeSpecName: "bundle") pod "a83000c5-7baa-4587-980d-90391869a32c" (UID: "a83000c5-7baa-4587-980d-90391869a32c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:13:23 crc kubenswrapper[4636]: I1003 14:13:23.845209 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a83000c5-7baa-4587-980d-90391869a32c-kube-api-access-pdbgd" (OuterVolumeSpecName: "kube-api-access-pdbgd") pod "a83000c5-7baa-4587-980d-90391869a32c" (UID: "a83000c5-7baa-4587-980d-90391869a32c"). InnerVolumeSpecName "kube-api-access-pdbgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:13:23 crc kubenswrapper[4636]: I1003 14:13:23.853706 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a83000c5-7baa-4587-980d-90391869a32c-util" (OuterVolumeSpecName: "util") pod "a83000c5-7baa-4587-980d-90391869a32c" (UID: "a83000c5-7baa-4587-980d-90391869a32c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:13:23 crc kubenswrapper[4636]: I1003 14:13:23.939982 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdbgd\" (UniqueName: \"kubernetes.io/projected/a83000c5-7baa-4587-980d-90391869a32c-kube-api-access-pdbgd\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:23 crc kubenswrapper[4636]: I1003 14:13:23.940021 4636 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a83000c5-7baa-4587-980d-90391869a32c-util\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:23 crc kubenswrapper[4636]: I1003 14:13:23.940036 4636 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a83000c5-7baa-4587-980d-90391869a32c-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:13:24 crc kubenswrapper[4636]: I1003 14:13:24.561302 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj" event={"ID":"a83000c5-7baa-4587-980d-90391869a32c","Type":"ContainerDied","Data":"66a3197290d51d1bfb50dc8c852a83355f65a32f76ca9cf405c55a31eaaaeee4"} Oct 03 14:13:24 crc kubenswrapper[4636]: I1003 14:13:24.561626 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66a3197290d51d1bfb50dc8c852a83355f65a32f76ca9cf405c55a31eaaaeee4" Oct 03 14:13:24 crc kubenswrapper[4636]: I1003 14:13:24.561346 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj" Oct 03 14:13:27 crc kubenswrapper[4636]: I1003 14:13:27.963359 4636 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.329510 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-79b89cf995-qtfsw"] Oct 03 14:13:33 crc kubenswrapper[4636]: E1003 14:13:33.330022 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a83000c5-7baa-4587-980d-90391869a32c" containerName="extract" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.330033 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="a83000c5-7baa-4587-980d-90391869a32c" containerName="extract" Oct 03 14:13:33 crc kubenswrapper[4636]: E1003 14:13:33.330047 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a83000c5-7baa-4587-980d-90391869a32c" containerName="util" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.330052 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="a83000c5-7baa-4587-980d-90391869a32c" containerName="util" Oct 03 14:13:33 crc kubenswrapper[4636]: E1003 14:13:33.330060 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a83000c5-7baa-4587-980d-90391869a32c" containerName="pull" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.330066 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="a83000c5-7baa-4587-980d-90391869a32c" containerName="pull" Oct 03 14:13:33 crc kubenswrapper[4636]: E1003 14:13:33.330074 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6977d44-d8ff-4d40-959f-024da50c53fe" containerName="console" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.330079 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6977d44-d8ff-4d40-959f-024da50c53fe" containerName="console" Oct 03 
14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.330190 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="a83000c5-7baa-4587-980d-90391869a32c" containerName="extract" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.330199 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6977d44-d8ff-4d40-959f-024da50c53fe" containerName="console" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.330668 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-79b89cf995-qtfsw" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.334415 4636 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.336054 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.336398 4636 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-7rn8p" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.336651 4636 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.336875 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.355527 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wpg2\" (UniqueName: \"kubernetes.io/projected/4d75cbbf-e22d-49aa-ae40-c77a69421e1a-kube-api-access-2wpg2\") pod \"metallb-operator-controller-manager-79b89cf995-qtfsw\" (UID: \"4d75cbbf-e22d-49aa-ae40-c77a69421e1a\") " pod="metallb-system/metallb-operator-controller-manager-79b89cf995-qtfsw" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.355597 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4d75cbbf-e22d-49aa-ae40-c77a69421e1a-apiservice-cert\") pod \"metallb-operator-controller-manager-79b89cf995-qtfsw\" (UID: \"4d75cbbf-e22d-49aa-ae40-c77a69421e1a\") " pod="metallb-system/metallb-operator-controller-manager-79b89cf995-qtfsw" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.355617 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4d75cbbf-e22d-49aa-ae40-c77a69421e1a-webhook-cert\") pod \"metallb-operator-controller-manager-79b89cf995-qtfsw\" (UID: \"4d75cbbf-e22d-49aa-ae40-c77a69421e1a\") " pod="metallb-system/metallb-operator-controller-manager-79b89cf995-qtfsw" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.358173 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-79b89cf995-qtfsw"] Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.456912 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4d75cbbf-e22d-49aa-ae40-c77a69421e1a-apiservice-cert\") pod \"metallb-operator-controller-manager-79b89cf995-qtfsw\" (UID: \"4d75cbbf-e22d-49aa-ae40-c77a69421e1a\") " pod="metallb-system/metallb-operator-controller-manager-79b89cf995-qtfsw" Oct 03 14:13:33 crc 
kubenswrapper[4636]: I1003 14:13:33.456960 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4d75cbbf-e22d-49aa-ae40-c77a69421e1a-webhook-cert\") pod \"metallb-operator-controller-manager-79b89cf995-qtfsw\" (UID: \"4d75cbbf-e22d-49aa-ae40-c77a69421e1a\") " pod="metallb-system/metallb-operator-controller-manager-79b89cf995-qtfsw" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.457065 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wpg2\" (UniqueName: \"kubernetes.io/projected/4d75cbbf-e22d-49aa-ae40-c77a69421e1a-kube-api-access-2wpg2\") pod \"metallb-operator-controller-manager-79b89cf995-qtfsw\" (UID: \"4d75cbbf-e22d-49aa-ae40-c77a69421e1a\") " pod="metallb-system/metallb-operator-controller-manager-79b89cf995-qtfsw" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.465020 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4d75cbbf-e22d-49aa-ae40-c77a69421e1a-webhook-cert\") pod \"metallb-operator-controller-manager-79b89cf995-qtfsw\" (UID: \"4d75cbbf-e22d-49aa-ae40-c77a69421e1a\") " pod="metallb-system/metallb-operator-controller-manager-79b89cf995-qtfsw" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.479189 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4d75cbbf-e22d-49aa-ae40-c77a69421e1a-apiservice-cert\") pod \"metallb-operator-controller-manager-79b89cf995-qtfsw\" (UID: \"4d75cbbf-e22d-49aa-ae40-c77a69421e1a\") " pod="metallb-system/metallb-operator-controller-manager-79b89cf995-qtfsw" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.519906 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wpg2\" (UniqueName: \"kubernetes.io/projected/4d75cbbf-e22d-49aa-ae40-c77a69421e1a-kube-api-access-2wpg2\") pod \"metallb-operator-controller-manager-79b89cf995-qtfsw\" (UID: \"4d75cbbf-e22d-49aa-ae40-c77a69421e1a\") " pod="metallb-system/metallb-operator-controller-manager-79b89cf995-qtfsw" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.620958 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d746fccb7-rtxlz"] Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.621588 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7d746fccb7-rtxlz" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.624481 4636 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.625192 4636 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.630617 4636 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-jj42d" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.648167 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-79b89cf995-qtfsw" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.648580 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d746fccb7-rtxlz"] Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.659537 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vcvt\" (UniqueName: \"kubernetes.io/projected/fdeca3bd-7bca-4463-b480-1b94361da961-kube-api-access-8vcvt\") pod \"metallb-operator-webhook-server-7d746fccb7-rtxlz\" (UID: \"fdeca3bd-7bca-4463-b480-1b94361da961\") " pod="metallb-system/metallb-operator-webhook-server-7d746fccb7-rtxlz" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.659761 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fdeca3bd-7bca-4463-b480-1b94361da961-apiservice-cert\") pod \"metallb-operator-webhook-server-7d746fccb7-rtxlz\" (UID: \"fdeca3bd-7bca-4463-b480-1b94361da961\") " pod="metallb-system/metallb-operator-webhook-server-7d746fccb7-rtxlz" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.659933 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fdeca3bd-7bca-4463-b480-1b94361da961-webhook-cert\") pod \"metallb-operator-webhook-server-7d746fccb7-rtxlz\" (UID: \"fdeca3bd-7bca-4463-b480-1b94361da961\") " pod="metallb-system/metallb-operator-webhook-server-7d746fccb7-rtxlz" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.761290 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fdeca3bd-7bca-4463-b480-1b94361da961-webhook-cert\") pod \"metallb-operator-webhook-server-7d746fccb7-rtxlz\" (UID: \"fdeca3bd-7bca-4463-b480-1b94361da961\") " pod="metallb-system/metallb-operator-webhook-server-7d746fccb7-rtxlz" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.761590 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vcvt\" (UniqueName: \"kubernetes.io/projected/fdeca3bd-7bca-4463-b480-1b94361da961-kube-api-access-8vcvt\") pod \"metallb-operator-webhook-server-7d746fccb7-rtxlz\" (UID: \"fdeca3bd-7bca-4463-b480-1b94361da961\") " pod="metallb-system/metallb-operator-webhook-server-7d746fccb7-rtxlz" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.761744 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fdeca3bd-7bca-4463-b480-1b94361da961-apiservice-cert\") pod \"metallb-operator-webhook-server-7d746fccb7-rtxlz\" (UID: \"fdeca3bd-7bca-4463-b480-1b94361da961\") " pod="metallb-system/metallb-operator-webhook-server-7d746fccb7-rtxlz" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.765730 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fdeca3bd-7bca-4463-b480-1b94361da961-webhook-cert\") pod \"metallb-operator-webhook-server-7d746fccb7-rtxlz\" (UID: \"fdeca3bd-7bca-4463-b480-1b94361da961\") " pod="metallb-system/metallb-operator-webhook-server-7d746fccb7-rtxlz" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.768390 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/fdeca3bd-7bca-4463-b480-1b94361da961-apiservice-cert\") pod \"metallb-operator-webhook-server-7d746fccb7-rtxlz\" (UID: \"fdeca3bd-7bca-4463-b480-1b94361da961\") " pod="metallb-system/metallb-operator-webhook-server-7d746fccb7-rtxlz" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.782361 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vcvt\" (UniqueName: \"kubernetes.io/projected/fdeca3bd-7bca-4463-b480-1b94361da961-kube-api-access-8vcvt\") pod \"metallb-operator-webhook-server-7d746fccb7-rtxlz\" (UID: \"fdeca3bd-7bca-4463-b480-1b94361da961\") " pod="metallb-system/metallb-operator-webhook-server-7d746fccb7-rtxlz" Oct 03 14:13:33 crc kubenswrapper[4636]: I1003 14:13:33.935513 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7d746fccb7-rtxlz" Oct 03 14:13:34 crc kubenswrapper[4636]: I1003 14:13:34.239770 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-79b89cf995-qtfsw"] Oct 03 14:13:34 crc kubenswrapper[4636]: W1003 14:13:34.252622 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d75cbbf_e22d_49aa_ae40_c77a69421e1a.slice/crio-65c4f2b051939af7454326765bbac85f9c7c33118f0453e835c45447d16c29f1 WatchSource:0}: Error finding container 65c4f2b051939af7454326765bbac85f9c7c33118f0453e835c45447d16c29f1: Status 404 returned error can't find the container with id 65c4f2b051939af7454326765bbac85f9c7c33118f0453e835c45447d16c29f1 Oct 03 14:13:34 crc kubenswrapper[4636]: I1003 14:13:34.426418 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d746fccb7-rtxlz"] Oct 03 14:13:34 crc kubenswrapper[4636]: I1003 14:13:34.612556 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-79b89cf995-qtfsw" event={"ID":"4d75cbbf-e22d-49aa-ae40-c77a69421e1a","Type":"ContainerStarted","Data":"65c4f2b051939af7454326765bbac85f9c7c33118f0453e835c45447d16c29f1"} Oct 03 14:13:34 crc kubenswrapper[4636]: I1003 14:13:34.613484 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7d746fccb7-rtxlz" event={"ID":"fdeca3bd-7bca-4463-b480-1b94361da961","Type":"ContainerStarted","Data":"2d0024904c52b1a2a795fe072ee85fd418f649d6f3c723014424232333e4ca61"} Oct 03 14:13:39 crc kubenswrapper[4636]: I1003 14:13:39.163010 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:13:39 crc kubenswrapper[4636]: I1003 14:13:39.163523 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:13:39 crc kubenswrapper[4636]: I1003 14:13:39.660249 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7d746fccb7-rtxlz" 
event={"ID":"fdeca3bd-7bca-4463-b480-1b94361da961","Type":"ContainerStarted","Data":"eb8fee918e317e91129bcb45ad5fd617d5437945dac127650230320b6ca66b01"} Oct 03 14:13:39 crc kubenswrapper[4636]: I1003 14:13:39.661554 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7d746fccb7-rtxlz" Oct 03 14:13:39 crc kubenswrapper[4636]: I1003 14:13:39.664514 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-79b89cf995-qtfsw" event={"ID":"4d75cbbf-e22d-49aa-ae40-c77a69421e1a","Type":"ContainerStarted","Data":"e158a6343bfd3f4fcb9e94891ba318e89aa78f9f774b912213de2c530ca100a3"} Oct 03 14:13:39 crc kubenswrapper[4636]: I1003 14:13:39.665266 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-79b89cf995-qtfsw" Oct 03 14:13:39 crc kubenswrapper[4636]: I1003 14:13:39.680940 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7d746fccb7-rtxlz" podStartSLOduration=1.7782017890000001 podStartE2EDuration="6.680924133s" podCreationTimestamp="2025-10-03 14:13:33 +0000 UTC" firstStartedPulling="2025-10-03 14:13:34.434953632 +0000 UTC m=+764.293679879" lastFinishedPulling="2025-10-03 14:13:39.337675976 +0000 UTC m=+769.196402223" observedRunningTime="2025-10-03 14:13:39.678667055 +0000 UTC m=+769.537393302" watchObservedRunningTime="2025-10-03 14:13:39.680924133 +0000 UTC m=+769.539650380" Oct 03 14:13:39 crc kubenswrapper[4636]: I1003 14:13:39.702491 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-79b89cf995-qtfsw" podStartSLOduration=1.644147429 podStartE2EDuration="6.702471465s" podCreationTimestamp="2025-10-03 14:13:33 +0000 UTC" firstStartedPulling="2025-10-03 14:13:34.25598129 +0000 UTC m=+764.114707537" lastFinishedPulling="2025-10-03 14:13:39.314305326 +0000 UTC m=+769.173031573" observedRunningTime="2025-10-03 14:13:39.699076198 +0000 UTC m=+769.557802455" watchObservedRunningTime="2025-10-03 14:13:39.702471465 +0000 UTC m=+769.561197712" Oct 03 14:13:51 crc kubenswrapper[4636]: I1003 14:13:51.148235 4636 scope.go:117] "RemoveContainer" containerID="3590e607bb939f0376f5f7907e81903fece6dc0ceaa6146bebd6e56508b19c14" Oct 03 14:13:53 crc kubenswrapper[4636]: I1003 14:13:53.938785 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7d746fccb7-rtxlz" Oct 03 14:14:05 crc kubenswrapper[4636]: I1003 14:14:05.926374 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q2x6p"] Oct 03 14:14:05 crc kubenswrapper[4636]: I1003 14:14:05.928147 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q2x6p" Oct 03 14:14:05 crc kubenswrapper[4636]: I1003 14:14:05.959925 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q2x6p"] Oct 03 14:14:05 crc kubenswrapper[4636]: I1003 14:14:05.960215 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwmrg\" (UniqueName: \"kubernetes.io/projected/7dcc53f3-1d33-421f-adb3-2076f86658b1-kube-api-access-rwmrg\") pod \"redhat-operators-q2x6p\" (UID: \"7dcc53f3-1d33-421f-adb3-2076f86658b1\") " pod="openshift-marketplace/redhat-operators-q2x6p" Oct 03 14:14:05 crc kubenswrapper[4636]: I1003 14:14:05.960312 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dcc53f3-1d33-421f-adb3-2076f86658b1-utilities\") pod \"redhat-operators-q2x6p\" (UID: \"7dcc53f3-1d33-421f-adb3-2076f86658b1\") " pod="openshift-marketplace/redhat-operators-q2x6p" Oct 03 14:14:05 crc kubenswrapper[4636]: I1003 14:14:05.960438 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dcc53f3-1d33-421f-adb3-2076f86658b1-catalog-content\") pod \"redhat-operators-q2x6p\" (UID: \"7dcc53f3-1d33-421f-adb3-2076f86658b1\") " pod="openshift-marketplace/redhat-operators-q2x6p" Oct 03 14:14:06 crc kubenswrapper[4636]: I1003 14:14:06.061571 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dcc53f3-1d33-421f-adb3-2076f86658b1-catalog-content\") pod \"redhat-operators-q2x6p\" (UID: \"7dcc53f3-1d33-421f-adb3-2076f86658b1\") " pod="openshift-marketplace/redhat-operators-q2x6p" Oct 03 14:14:06 crc kubenswrapper[4636]: I1003 14:14:06.061638 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwmrg\" (UniqueName: \"kubernetes.io/projected/7dcc53f3-1d33-421f-adb3-2076f86658b1-kube-api-access-rwmrg\") pod \"redhat-operators-q2x6p\" (UID: \"7dcc53f3-1d33-421f-adb3-2076f86658b1\") " pod="openshift-marketplace/redhat-operators-q2x6p" Oct 03 14:14:06 crc kubenswrapper[4636]: I1003 14:14:06.061673 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dcc53f3-1d33-421f-adb3-2076f86658b1-utilities\") pod \"redhat-operators-q2x6p\" (UID: \"7dcc53f3-1d33-421f-adb3-2076f86658b1\") " pod="openshift-marketplace/redhat-operators-q2x6p" Oct 03 14:14:06 crc kubenswrapper[4636]: I1003 14:14:06.062125 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dcc53f3-1d33-421f-adb3-2076f86658b1-utilities\") pod \"redhat-operators-q2x6p\" (UID: \"7dcc53f3-1d33-421f-adb3-2076f86658b1\") " pod="openshift-marketplace/redhat-operators-q2x6p" Oct 03 14:14:06 crc kubenswrapper[4636]: I1003 14:14:06.062181 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dcc53f3-1d33-421f-adb3-2076f86658b1-catalog-content\") pod \"redhat-operators-q2x6p\" (UID: \"7dcc53f3-1d33-421f-adb3-2076f86658b1\") " pod="openshift-marketplace/redhat-operators-q2x6p" Oct 03 14:14:06 crc kubenswrapper[4636]: I1003 14:14:06.099238 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rwmrg\" (UniqueName: \"kubernetes.io/projected/7dcc53f3-1d33-421f-adb3-2076f86658b1-kube-api-access-rwmrg\") pod \"redhat-operators-q2x6p\" (UID: \"7dcc53f3-1d33-421f-adb3-2076f86658b1\") " pod="openshift-marketplace/redhat-operators-q2x6p" Oct 03 14:14:06 crc kubenswrapper[4636]: I1003 14:14:06.245755 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q2x6p" Oct 03 14:14:06 crc kubenswrapper[4636]: I1003 14:14:06.692529 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q2x6p"] Oct 03 14:14:06 crc kubenswrapper[4636]: I1003 14:14:06.800023 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2x6p" event={"ID":"7dcc53f3-1d33-421f-adb3-2076f86658b1","Type":"ContainerStarted","Data":"1e0064e27fcbec70be80d500723738bc25ea7f7d8870d1a8a787500c778e151c"} Oct 03 14:14:07 crc kubenswrapper[4636]: I1003 14:14:07.806184 4636 generic.go:334] "Generic (PLEG): container finished" podID="7dcc53f3-1d33-421f-adb3-2076f86658b1" containerID="7ca443ed09708e006eda5cd02dab6a81bc3221114d7ae4ed5ecb366e755fb58f" exitCode=0 Oct 03 14:14:07 crc kubenswrapper[4636]: I1003 14:14:07.806223 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2x6p" event={"ID":"7dcc53f3-1d33-421f-adb3-2076f86658b1","Type":"ContainerDied","Data":"7ca443ed09708e006eda5cd02dab6a81bc3221114d7ae4ed5ecb366e755fb58f"} Oct 03 14:14:09 crc kubenswrapper[4636]: I1003 14:14:09.163610 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:14:09 crc kubenswrapper[4636]: I1003 14:14:09.163943 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:14:09 crc kubenswrapper[4636]: I1003 14:14:09.163997 4636 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 14:14:09 crc kubenswrapper[4636]: I1003 14:14:09.164626 4636 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8c343b2c3198b919be0641d5d289b1294e3d107e0057a5a4c2427bf1f447e7a9"} pod="openshift-machine-config-operator/machine-config-daemon-ngmch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 14:14:09 crc kubenswrapper[4636]: I1003 14:14:09.164691 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" containerID="cri-o://8c343b2c3198b919be0641d5d289b1294e3d107e0057a5a4c2427bf1f447e7a9" gracePeriod=600 Oct 03 14:14:09 crc kubenswrapper[4636]: I1003 14:14:09.819178 4636 generic.go:334] "Generic (PLEG): container finished" podID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerID="8c343b2c3198b919be0641d5d289b1294e3d107e0057a5a4c2427bf1f447e7a9" exitCode=0 Oct 
03 14:14:09 crc kubenswrapper[4636]: I1003 14:14:09.819221 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerDied","Data":"8c343b2c3198b919be0641d5d289b1294e3d107e0057a5a4c2427bf1f447e7a9"} Oct 03 14:14:09 crc kubenswrapper[4636]: I1003 14:14:09.819556 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerStarted","Data":"07c604f152aa39f3430c1f62789f7be96b3f5a7c96a65ed6157e5d00f0a88d5d"} Oct 03 14:14:09 crc kubenswrapper[4636]: I1003 14:14:09.819583 4636 scope.go:117] "RemoveContainer" containerID="397ecbf6846cc3b94251ba0c02a817d6a89b5e1e5d3d2333691e31cc8372c3fc" Oct 03 14:14:09 crc kubenswrapper[4636]: I1003 14:14:09.822476 4636 generic.go:334] "Generic (PLEG): container finished" podID="7dcc53f3-1d33-421f-adb3-2076f86658b1" containerID="4c87a3c94f594b3fcb8a904d3621cfe1e83402789e672de409d554ce9d76af72" exitCode=0 Oct 03 14:14:09 crc kubenswrapper[4636]: I1003 14:14:09.822515 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2x6p" event={"ID":"7dcc53f3-1d33-421f-adb3-2076f86658b1","Type":"ContainerDied","Data":"4c87a3c94f594b3fcb8a904d3621cfe1e83402789e672de409d554ce9d76af72"} Oct 03 14:14:11 crc kubenswrapper[4636]: I1003 14:14:11.837389 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2x6p" event={"ID":"7dcc53f3-1d33-421f-adb3-2076f86658b1","Type":"ContainerStarted","Data":"8538e7468d01eee33c5663d5c9cfe17062ac6166bb700858128a8b36c4d169a1"} Oct 03 14:14:11 crc kubenswrapper[4636]: I1003 14:14:11.855306 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q2x6p" podStartSLOduration=3.196499764 podStartE2EDuration="6.855290576s" podCreationTimestamp="2025-10-03 14:14:05 +0000 UTC" firstStartedPulling="2025-10-03 14:14:07.807697165 +0000 UTC m=+797.666423412" lastFinishedPulling="2025-10-03 14:14:11.466487977 +0000 UTC m=+801.325214224" observedRunningTime="2025-10-03 14:14:11.854177968 +0000 UTC m=+801.712904215" watchObservedRunningTime="2025-10-03 14:14:11.855290576 +0000 UTC m=+801.714016823" Oct 03 14:14:13 crc kubenswrapper[4636]: I1003 14:14:13.522829 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-djwt8"] Oct 03 14:14:13 crc kubenswrapper[4636]: I1003 14:14:13.524086 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-djwt8" Oct 03 14:14:13 crc kubenswrapper[4636]: I1003 14:14:13.537027 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-djwt8"] Oct 03 14:14:13 crc kubenswrapper[4636]: I1003 14:14:13.551090 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc4c313e-de02-4354-9aed-468571a0ff96-catalog-content\") pod \"community-operators-djwt8\" (UID: \"fc4c313e-de02-4354-9aed-468571a0ff96\") " pod="openshift-marketplace/community-operators-djwt8" Oct 03 14:14:13 crc kubenswrapper[4636]: I1003 14:14:13.551223 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkjvw\" (UniqueName: \"kubernetes.io/projected/fc4c313e-de02-4354-9aed-468571a0ff96-kube-api-access-zkjvw\") pod \"community-operators-djwt8\" (UID: \"fc4c313e-de02-4354-9aed-468571a0ff96\") " pod="openshift-marketplace/community-operators-djwt8" Oct 03 14:14:13 crc kubenswrapper[4636]: I1003 14:14:13.551298 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc4c313e-de02-4354-9aed-468571a0ff96-utilities\") pod \"community-operators-djwt8\" (UID: \"fc4c313e-de02-4354-9aed-468571a0ff96\") " pod="openshift-marketplace/community-operators-djwt8" Oct 03 14:14:13 crc kubenswrapper[4636]: I1003 14:14:13.653003 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc4c313e-de02-4354-9aed-468571a0ff96-catalog-content\") pod \"community-operators-djwt8\" (UID: \"fc4c313e-de02-4354-9aed-468571a0ff96\") " pod="openshift-marketplace/community-operators-djwt8" Oct 03 14:14:13 crc kubenswrapper[4636]: I1003 14:14:13.653053 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkjvw\" (UniqueName: \"kubernetes.io/projected/fc4c313e-de02-4354-9aed-468571a0ff96-kube-api-access-zkjvw\") pod \"community-operators-djwt8\" (UID: \"fc4c313e-de02-4354-9aed-468571a0ff96\") " pod="openshift-marketplace/community-operators-djwt8" Oct 03 14:14:13 crc kubenswrapper[4636]: I1003 14:14:13.653091 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc4c313e-de02-4354-9aed-468571a0ff96-utilities\") pod \"community-operators-djwt8\" (UID: \"fc4c313e-de02-4354-9aed-468571a0ff96\") " pod="openshift-marketplace/community-operators-djwt8" Oct 03 14:14:13 crc kubenswrapper[4636]: I1003 14:14:13.653596 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc4c313e-de02-4354-9aed-468571a0ff96-utilities\") pod \"community-operators-djwt8\" (UID: \"fc4c313e-de02-4354-9aed-468571a0ff96\") " pod="openshift-marketplace/community-operators-djwt8" Oct 03 14:14:13 crc kubenswrapper[4636]: I1003 14:14:13.653613 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc4c313e-de02-4354-9aed-468571a0ff96-catalog-content\") pod \"community-operators-djwt8\" (UID: \"fc4c313e-de02-4354-9aed-468571a0ff96\") " pod="openshift-marketplace/community-operators-djwt8" Oct 03 14:14:13 crc kubenswrapper[4636]: I1003 14:14:13.653679 4636 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-79b89cf995-qtfsw" Oct 03 14:14:13 crc kubenswrapper[4636]: I1003 14:14:13.671312 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkjvw\" (UniqueName: \"kubernetes.io/projected/fc4c313e-de02-4354-9aed-468571a0ff96-kube-api-access-zkjvw\") pod \"community-operators-djwt8\" (UID: \"fc4c313e-de02-4354-9aed-468571a0ff96\") " pod="openshift-marketplace/community-operators-djwt8" Oct 03 14:14:13 crc kubenswrapper[4636]: I1003 14:14:13.845797 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-djwt8" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.458553 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-djwt8"] Oct 03 14:14:14 crc kubenswrapper[4636]: W1003 14:14:14.465994 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc4c313e_de02_4354_9aed_468571a0ff96.slice/crio-9e6b30c1d87be14435a00cbbb119d31bdcc2c7ac9f1c96a787bac77da8a1e26b WatchSource:0}: Error finding container 9e6b30c1d87be14435a00cbbb119d31bdcc2c7ac9f1c96a787bac77da8a1e26b: Status 404 returned error can't find the container with id 9e6b30c1d87be14435a00cbbb119d31bdcc2c7ac9f1c96a787bac77da8a1e26b Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.485330 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-tzvhr"] Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.487388 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.489213 4636 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-mcxhg" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.489604 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.490301 4636 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.493356 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-ttj4x"] Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.494239 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-ttj4x" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.495352 4636 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.526425 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-ttj4x"] Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.565635 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fb4v\" (UniqueName: \"kubernetes.io/projected/360e3dae-23f1-4ddd-9815-d6a41e611501-kube-api-access-6fb4v\") pod \"frr-k8s-webhook-server-64bf5d555-ttj4x\" (UID: \"360e3dae-23f1-4ddd-9815-d6a41e611501\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-ttj4x" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.565698 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27812b9a-f947-40fd-a74b-f10fa236e965-metrics-certs\") pod \"frr-k8s-tzvhr\" (UID: \"27812b9a-f947-40fd-a74b-f10fa236e965\") " pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.565749 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlm45\" (UniqueName: \"kubernetes.io/projected/27812b9a-f947-40fd-a74b-f10fa236e965-kube-api-access-hlm45\") pod \"frr-k8s-tzvhr\" (UID: \"27812b9a-f947-40fd-a74b-f10fa236e965\") " pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.565774 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/27812b9a-f947-40fd-a74b-f10fa236e965-metrics\") pod \"frr-k8s-tzvhr\" (UID: \"27812b9a-f947-40fd-a74b-f10fa236e965\") " pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.565844 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/27812b9a-f947-40fd-a74b-f10fa236e965-frr-sockets\") pod \"frr-k8s-tzvhr\" (UID: \"27812b9a-f947-40fd-a74b-f10fa236e965\") " pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.565926 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/27812b9a-f947-40fd-a74b-f10fa236e965-reloader\") pod \"frr-k8s-tzvhr\" (UID: \"27812b9a-f947-40fd-a74b-f10fa236e965\") " pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.566025 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/27812b9a-f947-40fd-a74b-f10fa236e965-frr-startup\") pod \"frr-k8s-tzvhr\" (UID: \"27812b9a-f947-40fd-a74b-f10fa236e965\") " pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.566046 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/27812b9a-f947-40fd-a74b-f10fa236e965-frr-conf\") pod \"frr-k8s-tzvhr\" (UID: \"27812b9a-f947-40fd-a74b-f10fa236e965\") " pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc 
kubenswrapper[4636]: I1003 14:14:14.566059 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/360e3dae-23f1-4ddd-9815-d6a41e611501-cert\") pod \"frr-k8s-webhook-server-64bf5d555-ttj4x\" (UID: \"360e3dae-23f1-4ddd-9815-d6a41e611501\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-ttj4x" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.656897 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-ggz7j"] Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.658237 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-ggz7j" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.659975 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-87w8j"] Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.660665 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-87w8j" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.663657 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.663902 4636 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.664092 4636 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-m294b" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.664238 4636 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.666390 4636 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.666948 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/27812b9a-f947-40fd-a74b-f10fa236e965-frr-startup\") pod \"frr-k8s-tzvhr\" (UID: \"27812b9a-f947-40fd-a74b-f10fa236e965\") " pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.667043 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/27812b9a-f947-40fd-a74b-f10fa236e965-frr-conf\") pod \"frr-k8s-tzvhr\" (UID: \"27812b9a-f947-40fd-a74b-f10fa236e965\") " pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.667166 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/360e3dae-23f1-4ddd-9815-d6a41e611501-cert\") pod \"frr-k8s-webhook-server-64bf5d555-ttj4x\" (UID: \"360e3dae-23f1-4ddd-9815-d6a41e611501\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-ttj4x" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.667254 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fb4v\" (UniqueName: \"kubernetes.io/projected/360e3dae-23f1-4ddd-9815-d6a41e611501-kube-api-access-6fb4v\") pod \"frr-k8s-webhook-server-64bf5d555-ttj4x\" (UID: \"360e3dae-23f1-4ddd-9815-d6a41e611501\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-ttj4x" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.667323 4636 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27812b9a-f947-40fd-a74b-f10fa236e965-metrics-certs\") pod \"frr-k8s-tzvhr\" (UID: \"27812b9a-f947-40fd-a74b-f10fa236e965\") " pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.667388 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlm45\" (UniqueName: \"kubernetes.io/projected/27812b9a-f947-40fd-a74b-f10fa236e965-kube-api-access-hlm45\") pod \"frr-k8s-tzvhr\" (UID: \"27812b9a-f947-40fd-a74b-f10fa236e965\") " pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.667648 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/27812b9a-f947-40fd-a74b-f10fa236e965-metrics\") pod \"frr-k8s-tzvhr\" (UID: \"27812b9a-f947-40fd-a74b-f10fa236e965\") " pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.667724 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/27812b9a-f947-40fd-a74b-f10fa236e965-frr-sockets\") pod \"frr-k8s-tzvhr\" (UID: \"27812b9a-f947-40fd-a74b-f10fa236e965\") " pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.667846 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/27812b9a-f947-40fd-a74b-f10fa236e965-reloader\") pod \"frr-k8s-tzvhr\" (UID: \"27812b9a-f947-40fd-a74b-f10fa236e965\") " pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.668676 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/27812b9a-f947-40fd-a74b-f10fa236e965-frr-conf\") pod \"frr-k8s-tzvhr\" (UID: \"27812b9a-f947-40fd-a74b-f10fa236e965\") " pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.668875 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/27812b9a-f947-40fd-a74b-f10fa236e965-metrics\") pod \"frr-k8s-tzvhr\" (UID: \"27812b9a-f947-40fd-a74b-f10fa236e965\") " pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.669155 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/27812b9a-f947-40fd-a74b-f10fa236e965-frr-startup\") pod \"frr-k8s-tzvhr\" (UID: \"27812b9a-f947-40fd-a74b-f10fa236e965\") " pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.669416 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/27812b9a-f947-40fd-a74b-f10fa236e965-frr-sockets\") pod \"frr-k8s-tzvhr\" (UID: \"27812b9a-f947-40fd-a74b-f10fa236e965\") " pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.674499 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27812b9a-f947-40fd-a74b-f10fa236e965-metrics-certs\") pod \"frr-k8s-tzvhr\" (UID: \"27812b9a-f947-40fd-a74b-f10fa236e965\") " pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.681352 4636 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/27812b9a-f947-40fd-a74b-f10fa236e965-reloader\") pod \"frr-k8s-tzvhr\" (UID: \"27812b9a-f947-40fd-a74b-f10fa236e965\") " pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.697683 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/360e3dae-23f1-4ddd-9815-d6a41e611501-cert\") pod \"frr-k8s-webhook-server-64bf5d555-ttj4x\" (UID: \"360e3dae-23f1-4ddd-9815-d6a41e611501\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-ttj4x" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.704121 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-87w8j"] Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.713758 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fb4v\" (UniqueName: \"kubernetes.io/projected/360e3dae-23f1-4ddd-9815-d6a41e611501-kube-api-access-6fb4v\") pod \"frr-k8s-webhook-server-64bf5d555-ttj4x\" (UID: \"360e3dae-23f1-4ddd-9815-d6a41e611501\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-ttj4x" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.722675 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlm45\" (UniqueName: \"kubernetes.io/projected/27812b9a-f947-40fd-a74b-f10fa236e965-kube-api-access-hlm45\") pod \"frr-k8s-tzvhr\" (UID: \"27812b9a-f947-40fd-a74b-f10fa236e965\") " pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.769040 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5c8bfd9-03d0-45ec-825a-d0c8f613c29c-cert\") pod \"controller-68d546b9d8-87w8j\" (UID: \"f5c8bfd9-03d0-45ec-825a-d0c8f613c29c\") " pod="metallb-system/controller-68d546b9d8-87w8j" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.769127 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5c8bfd9-03d0-45ec-825a-d0c8f613c29c-metrics-certs\") pod \"controller-68d546b9d8-87w8j\" (UID: \"f5c8bfd9-03d0-45ec-825a-d0c8f613c29c\") " pod="metallb-system/controller-68d546b9d8-87w8j" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.769185 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhbhr\" (UniqueName: \"kubernetes.io/projected/f5c8bfd9-03d0-45ec-825a-d0c8f613c29c-kube-api-access-jhbhr\") pod \"controller-68d546b9d8-87w8j\" (UID: \"f5c8bfd9-03d0-45ec-825a-d0c8f613c29c\") " pod="metallb-system/controller-68d546b9d8-87w8j" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.769213 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/39a6b95f-24cf-4365-93c0-b47b7a7672fb-metallb-excludel2\") pod \"speaker-ggz7j\" (UID: \"39a6b95f-24cf-4365-93c0-b47b7a7672fb\") " pod="metallb-system/speaker-ggz7j" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.769237 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/39a6b95f-24cf-4365-93c0-b47b7a7672fb-memberlist\") pod \"speaker-ggz7j\" (UID: \"39a6b95f-24cf-4365-93c0-b47b7a7672fb\") " pod="metallb-system/speaker-ggz7j" Oct 
03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.769281 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39a6b95f-24cf-4365-93c0-b47b7a7672fb-metrics-certs\") pod \"speaker-ggz7j\" (UID: \"39a6b95f-24cf-4365-93c0-b47b7a7672fb\") " pod="metallb-system/speaker-ggz7j" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.769310 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqc9q\" (UniqueName: \"kubernetes.io/projected/39a6b95f-24cf-4365-93c0-b47b7a7672fb-kube-api-access-vqc9q\") pod \"speaker-ggz7j\" (UID: \"39a6b95f-24cf-4365-93c0-b47b7a7672fb\") " pod="metallb-system/speaker-ggz7j" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.853719 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djwt8" event={"ID":"fc4c313e-de02-4354-9aed-468571a0ff96","Type":"ContainerStarted","Data":"ff9af1b6b880c56b287eeba12b296c04d0a53f3459de4c8bcee49d943085c5db"} Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.853768 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djwt8" event={"ID":"fc4c313e-de02-4354-9aed-468571a0ff96","Type":"ContainerStarted","Data":"9e6b30c1d87be14435a00cbbb119d31bdcc2c7ac9f1c96a787bac77da8a1e26b"} Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.870614 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39a6b95f-24cf-4365-93c0-b47b7a7672fb-metrics-certs\") pod \"speaker-ggz7j\" (UID: \"39a6b95f-24cf-4365-93c0-b47b7a7672fb\") " pod="metallb-system/speaker-ggz7j" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.870678 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqc9q\" (UniqueName: \"kubernetes.io/projected/39a6b95f-24cf-4365-93c0-b47b7a7672fb-kube-api-access-vqc9q\") pod \"speaker-ggz7j\" (UID: \"39a6b95f-24cf-4365-93c0-b47b7a7672fb\") " pod="metallb-system/speaker-ggz7j" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.870738 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5c8bfd9-03d0-45ec-825a-d0c8f613c29c-cert\") pod \"controller-68d546b9d8-87w8j\" (UID: \"f5c8bfd9-03d0-45ec-825a-d0c8f613c29c\") " pod="metallb-system/controller-68d546b9d8-87w8j" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.870778 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5c8bfd9-03d0-45ec-825a-d0c8f613c29c-metrics-certs\") pod \"controller-68d546b9d8-87w8j\" (UID: \"f5c8bfd9-03d0-45ec-825a-d0c8f613c29c\") " pod="metallb-system/controller-68d546b9d8-87w8j" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.870815 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhbhr\" (UniqueName: \"kubernetes.io/projected/f5c8bfd9-03d0-45ec-825a-d0c8f613c29c-kube-api-access-jhbhr\") pod \"controller-68d546b9d8-87w8j\" (UID: \"f5c8bfd9-03d0-45ec-825a-d0c8f613c29c\") " pod="metallb-system/controller-68d546b9d8-87w8j" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.870841 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/39a6b95f-24cf-4365-93c0-b47b7a7672fb-metallb-excludel2\") pod \"speaker-ggz7j\" (UID: \"39a6b95f-24cf-4365-93c0-b47b7a7672fb\") " pod="metallb-system/speaker-ggz7j" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.870863 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/39a6b95f-24cf-4365-93c0-b47b7a7672fb-memberlist\") pod \"speaker-ggz7j\" (UID: \"39a6b95f-24cf-4365-93c0-b47b7a7672fb\") " pod="metallb-system/speaker-ggz7j" Oct 03 14:14:14 crc kubenswrapper[4636]: E1003 14:14:14.871362 4636 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 03 14:14:14 crc kubenswrapper[4636]: E1003 14:14:14.871422 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5c8bfd9-03d0-45ec-825a-d0c8f613c29c-metrics-certs podName:f5c8bfd9-03d0-45ec-825a-d0c8f613c29c nodeName:}" failed. No retries permitted until 2025-10-03 14:14:15.37140627 +0000 UTC m=+805.230132517 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f5c8bfd9-03d0-45ec-825a-d0c8f613c29c-metrics-certs") pod "controller-68d546b9d8-87w8j" (UID: "f5c8bfd9-03d0-45ec-825a-d0c8f613c29c") : secret "controller-certs-secret" not found Oct 03 14:14:14 crc kubenswrapper[4636]: E1003 14:14:14.871601 4636 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 03 14:14:14 crc kubenswrapper[4636]: E1003 14:14:14.871632 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a6b95f-24cf-4365-93c0-b47b7a7672fb-memberlist podName:39a6b95f-24cf-4365-93c0-b47b7a7672fb nodeName:}" failed. No retries permitted until 2025-10-03 14:14:15.371623685 +0000 UTC m=+805.230350072 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/39a6b95f-24cf-4365-93c0-b47b7a7672fb-memberlist") pod "speaker-ggz7j" (UID: "39a6b95f-24cf-4365-93c0-b47b7a7672fb") : secret "metallb-memberlist" not found Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.872145 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/39a6b95f-24cf-4365-93c0-b47b7a7672fb-metallb-excludel2\") pod \"speaker-ggz7j\" (UID: \"39a6b95f-24cf-4365-93c0-b47b7a7672fb\") " pod="metallb-system/speaker-ggz7j" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.874225 4636 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.874729 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39a6b95f-24cf-4365-93c0-b47b7a7672fb-metrics-certs\") pod \"speaker-ggz7j\" (UID: \"39a6b95f-24cf-4365-93c0-b47b7a7672fb\") " pod="metallb-system/speaker-ggz7j" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.879858 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-tzvhr" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.885692 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-ttj4x" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.885706 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5c8bfd9-03d0-45ec-825a-d0c8f613c29c-cert\") pod \"controller-68d546b9d8-87w8j\" (UID: \"f5c8bfd9-03d0-45ec-825a-d0c8f613c29c\") " pod="metallb-system/controller-68d546b9d8-87w8j" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.895745 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqc9q\" (UniqueName: \"kubernetes.io/projected/39a6b95f-24cf-4365-93c0-b47b7a7672fb-kube-api-access-vqc9q\") pod \"speaker-ggz7j\" (UID: \"39a6b95f-24cf-4365-93c0-b47b7a7672fb\") " pod="metallb-system/speaker-ggz7j" Oct 03 14:14:14 crc kubenswrapper[4636]: I1003 14:14:14.903361 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhbhr\" (UniqueName: \"kubernetes.io/projected/f5c8bfd9-03d0-45ec-825a-d0c8f613c29c-kube-api-access-jhbhr\") pod \"controller-68d546b9d8-87w8j\" (UID: \"f5c8bfd9-03d0-45ec-825a-d0c8f613c29c\") " pod="metallb-system/controller-68d546b9d8-87w8j" Oct 03 14:14:15 crc kubenswrapper[4636]: I1003 14:14:15.109575 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-ttj4x"] Oct 03 14:14:15 crc kubenswrapper[4636]: W1003 14:14:15.114352 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod360e3dae_23f1_4ddd_9815_d6a41e611501.slice/crio-de754e6786be8503a5f81d6dca84b512c46185914052f50a9a9982479bd8f757 WatchSource:0}: Error finding container de754e6786be8503a5f81d6dca84b512c46185914052f50a9a9982479bd8f757: Status 404 returned error can't find the container with id de754e6786be8503a5f81d6dca84b512c46185914052f50a9a9982479bd8f757 Oct 03 14:14:15 crc kubenswrapper[4636]: I1003 14:14:15.375453 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5c8bfd9-03d0-45ec-825a-d0c8f613c29c-metrics-certs\") pod \"controller-68d546b9d8-87w8j\" (UID: \"f5c8bfd9-03d0-45ec-825a-d0c8f613c29c\") " pod="metallb-system/controller-68d546b9d8-87w8j" Oct 03 14:14:15 crc kubenswrapper[4636]: I1003 14:14:15.375511 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/39a6b95f-24cf-4365-93c0-b47b7a7672fb-memberlist\") pod \"speaker-ggz7j\" (UID: \"39a6b95f-24cf-4365-93c0-b47b7a7672fb\") " pod="metallb-system/speaker-ggz7j" Oct 03 14:14:15 crc kubenswrapper[4636]: E1003 14:14:15.375664 4636 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 03 14:14:15 crc kubenswrapper[4636]: E1003 14:14:15.375716 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a6b95f-24cf-4365-93c0-b47b7a7672fb-memberlist podName:39a6b95f-24cf-4365-93c0-b47b7a7672fb nodeName:}" failed. No retries permitted until 2025-10-03 14:14:16.375701652 +0000 UTC m=+806.234427899 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/39a6b95f-24cf-4365-93c0-b47b7a7672fb-memberlist") pod "speaker-ggz7j" (UID: "39a6b95f-24cf-4365-93c0-b47b7a7672fb") : secret "metallb-memberlist" not found
Oct 03 14:14:15 crc kubenswrapper[4636]: I1003 14:14:15.383652 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5c8bfd9-03d0-45ec-825a-d0c8f613c29c-metrics-certs\") pod \"controller-68d546b9d8-87w8j\" (UID: \"f5c8bfd9-03d0-45ec-825a-d0c8f613c29c\") " pod="metallb-system/controller-68d546b9d8-87w8j"
Oct 03 14:14:15 crc kubenswrapper[4636]: I1003 14:14:15.645303 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-87w8j"
Oct 03 14:14:15 crc kubenswrapper[4636]: I1003 14:14:15.863714 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-ttj4x" event={"ID":"360e3dae-23f1-4ddd-9815-d6a41e611501","Type":"ContainerStarted","Data":"de754e6786be8503a5f81d6dca84b512c46185914052f50a9a9982479bd8f757"}
Oct 03 14:14:15 crc kubenswrapper[4636]: I1003 14:14:15.871817 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tzvhr" event={"ID":"27812b9a-f947-40fd-a74b-f10fa236e965","Type":"ContainerStarted","Data":"cf4b745d9b051e772e6b0f0eed7532e4f5d0954ffd014801c44025620b6af283"}
Oct 03 14:14:15 crc kubenswrapper[4636]: I1003 14:14:15.874042 4636 generic.go:334] "Generic (PLEG): container finished" podID="fc4c313e-de02-4354-9aed-468571a0ff96" containerID="ff9af1b6b880c56b287eeba12b296c04d0a53f3459de4c8bcee49d943085c5db" exitCode=0
Oct 03 14:14:15 crc kubenswrapper[4636]: I1003 14:14:15.874071 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djwt8" event={"ID":"fc4c313e-de02-4354-9aed-468571a0ff96","Type":"ContainerDied","Data":"ff9af1b6b880c56b287eeba12b296c04d0a53f3459de4c8bcee49d943085c5db"}
Oct 03 14:14:16 crc kubenswrapper[4636]: I1003 14:14:16.068696 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-87w8j"]
Oct 03 14:14:16 crc kubenswrapper[4636]: W1003 14:14:16.072352 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5c8bfd9_03d0_45ec_825a_d0c8f613c29c.slice/crio-d57d495c3d75ac1aa4a8b046b3ef1df0701de0927c1da5d21e343abc872311e7 WatchSource:0}: Error finding container d57d495c3d75ac1aa4a8b046b3ef1df0701de0927c1da5d21e343abc872311e7: Status 404 returned error can't find the container with id d57d495c3d75ac1aa4a8b046b3ef1df0701de0927c1da5d21e343abc872311e7
Oct 03 14:14:16 crc kubenswrapper[4636]: I1003 14:14:16.246303 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q2x6p"
Oct 03 14:14:16 crc kubenswrapper[4636]: I1003 14:14:16.246418 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q2x6p"
Oct 03 14:14:16 crc kubenswrapper[4636]: I1003 14:14:16.389196 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/39a6b95f-24cf-4365-93c0-b47b7a7672fb-memberlist\") pod \"speaker-ggz7j\" (UID: \"39a6b95f-24cf-4365-93c0-b47b7a7672fb\") " pod="metallb-system/speaker-ggz7j"
Oct 03 14:14:16 crc kubenswrapper[4636]: I1003 14:14:16.395092 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/39a6b95f-24cf-4365-93c0-b47b7a7672fb-memberlist\") pod \"speaker-ggz7j\" (UID: \"39a6b95f-24cf-4365-93c0-b47b7a7672fb\") " pod="metallb-system/speaker-ggz7j"
Oct 03 14:14:16 crc kubenswrapper[4636]: I1003 14:14:16.480604 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-ggz7j"
Oct 03 14:14:16 crc kubenswrapper[4636]: I1003 14:14:16.885971 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-87w8j" event={"ID":"f5c8bfd9-03d0-45ec-825a-d0c8f613c29c","Type":"ContainerStarted","Data":"ad54170128219270ed8a9cf9130d7d5e8e96ef8dfac1a22e9d8de764c45f3ac9"}
Oct 03 14:14:16 crc kubenswrapper[4636]: I1003 14:14:16.886280 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-87w8j" event={"ID":"f5c8bfd9-03d0-45ec-825a-d0c8f613c29c","Type":"ContainerStarted","Data":"41cb3035122bb03a7279a0b3dbd269365829d17008789e97399134699d30660f"}
Oct 03 14:14:16 crc kubenswrapper[4636]: I1003 14:14:16.886291 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-87w8j" event={"ID":"f5c8bfd9-03d0-45ec-825a-d0c8f613c29c","Type":"ContainerStarted","Data":"d57d495c3d75ac1aa4a8b046b3ef1df0701de0927c1da5d21e343abc872311e7"}
Oct 03 14:14:16 crc kubenswrapper[4636]: I1003 14:14:16.887238 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ggz7j" event={"ID":"39a6b95f-24cf-4365-93c0-b47b7a7672fb","Type":"ContainerStarted","Data":"248b7cda4b2078a9f43d7700ddcd3484c36b94d836df58805fd908cd3ae53bd4"}
Oct 03 14:14:17 crc kubenswrapper[4636]: I1003 14:14:17.304744 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-q2x6p" podUID="7dcc53f3-1d33-421f-adb3-2076f86658b1" containerName="registry-server" probeResult="failure" output=<
Oct 03 14:14:17 crc kubenswrapper[4636]: timeout: failed to connect service ":50051" within 1s
Oct 03 14:14:17 crc kubenswrapper[4636]: >
Oct 03 14:14:17 crc kubenswrapper[4636]: I1003 14:14:17.895928 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ggz7j" event={"ID":"39a6b95f-24cf-4365-93c0-b47b7a7672fb","Type":"ContainerStarted","Data":"1b80facde6512403e513e35313e064ad07efffbc8a179f41e82c3daf010e083c"}
Oct 03 14:14:17 crc kubenswrapper[4636]: I1003 14:14:17.905087 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djwt8" event={"ID":"fc4c313e-de02-4354-9aed-468571a0ff96","Type":"ContainerStarted","Data":"869c0ee6fdd29c62472d632e4d27bb7da99108afd6144eea266abcf72b749b41"}
Oct 03 14:14:17 crc kubenswrapper[4636]: I1003 14:14:17.905164 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-87w8j"
Oct 03 14:14:17 crc kubenswrapper[4636]: I1003 14:14:17.961675 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-87w8j" podStartSLOduration=3.961639093 podStartE2EDuration="3.961639093s" podCreationTimestamp="2025-10-03 14:14:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:14:17.935122017 +0000 UTC m=+807.793848274" watchObservedRunningTime="2025-10-03 14:14:17.961639093 +0000 UTC m=+807.820365340"
Oct 03 14:14:18 crc kubenswrapper[4636]: I1003 14:14:18.918732 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ggz7j" event={"ID":"39a6b95f-24cf-4365-93c0-b47b7a7672fb","Type":"ContainerStarted","Data":"dadc02bcd13dbe16c298b75a19c1d6f4ab0700b900e85f91a9cb935b72a673bd"}
Oct 03 14:14:18 crc kubenswrapper[4636]: I1003 14:14:18.919057 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-ggz7j"
Oct 03 14:14:18 crc kubenswrapper[4636]: I1003 14:14:18.925733 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djwt8" event={"ID":"fc4c313e-de02-4354-9aed-468571a0ff96","Type":"ContainerDied","Data":"869c0ee6fdd29c62472d632e4d27bb7da99108afd6144eea266abcf72b749b41"}
Oct 03 14:14:18 crc kubenswrapper[4636]: I1003 14:14:18.926126 4636 generic.go:334] "Generic (PLEG): container finished" podID="fc4c313e-de02-4354-9aed-468571a0ff96" containerID="869c0ee6fdd29c62472d632e4d27bb7da99108afd6144eea266abcf72b749b41" exitCode=0
Oct 03 14:14:18 crc kubenswrapper[4636]: I1003 14:14:18.938744 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-ggz7j" podStartSLOduration=4.938728393 podStartE2EDuration="4.938728393s" podCreationTimestamp="2025-10-03 14:14:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:14:18.937154493 +0000 UTC m=+808.795880740" watchObservedRunningTime="2025-10-03 14:14:18.938728393 +0000 UTC m=+808.797454640"
Oct 03 14:14:19 crc kubenswrapper[4636]: I1003 14:14:19.311040 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8tf7v"]
Oct 03 14:14:19 crc kubenswrapper[4636]: I1003 14:14:19.312121 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8tf7v"
Oct 03 14:14:19 crc kubenswrapper[4636]: I1003 14:14:19.328282 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8tf7v"]
Oct 03 14:14:19 crc kubenswrapper[4636]: I1003 14:14:19.429932 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9721832c-e1e9-49f9-96ab-253319b635bf-catalog-content\") pod \"redhat-marketplace-8tf7v\" (UID: \"9721832c-e1e9-49f9-96ab-253319b635bf\") " pod="openshift-marketplace/redhat-marketplace-8tf7v"
Oct 03 14:14:19 crc kubenswrapper[4636]: I1003 14:14:19.430018 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr9vx\" (UniqueName: \"kubernetes.io/projected/9721832c-e1e9-49f9-96ab-253319b635bf-kube-api-access-cr9vx\") pod \"redhat-marketplace-8tf7v\" (UID: \"9721832c-e1e9-49f9-96ab-253319b635bf\") " pod="openshift-marketplace/redhat-marketplace-8tf7v"
Oct 03 14:14:19 crc kubenswrapper[4636]: I1003 14:14:19.430082 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9721832c-e1e9-49f9-96ab-253319b635bf-utilities\") pod \"redhat-marketplace-8tf7v\" (UID: \"9721832c-e1e9-49f9-96ab-253319b635bf\") " pod="openshift-marketplace/redhat-marketplace-8tf7v"
Oct 03 14:14:19 crc kubenswrapper[4636]: I1003 14:14:19.531767 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9721832c-e1e9-49f9-96ab-253319b635bf-utilities\") pod \"redhat-marketplace-8tf7v\" (UID: \"9721832c-e1e9-49f9-96ab-253319b635bf\") " pod="openshift-marketplace/redhat-marketplace-8tf7v"
Oct 03 14:14:19 crc kubenswrapper[4636]: I1003 14:14:19.532141 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9721832c-e1e9-49f9-96ab-253319b635bf-catalog-content\") pod \"redhat-marketplace-8tf7v\" (UID: \"9721832c-e1e9-49f9-96ab-253319b635bf\") " pod="openshift-marketplace/redhat-marketplace-8tf7v"
Oct 03 14:14:19 crc kubenswrapper[4636]: I1003 14:14:19.532199 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr9vx\" (UniqueName: \"kubernetes.io/projected/9721832c-e1e9-49f9-96ab-253319b635bf-kube-api-access-cr9vx\") pod \"redhat-marketplace-8tf7v\" (UID: \"9721832c-e1e9-49f9-96ab-253319b635bf\") " pod="openshift-marketplace/redhat-marketplace-8tf7v"
Oct 03 14:14:19 crc kubenswrapper[4636]: I1003 14:14:19.532767 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9721832c-e1e9-49f9-96ab-253319b635bf-utilities\") pod \"redhat-marketplace-8tf7v\" (UID: \"9721832c-e1e9-49f9-96ab-253319b635bf\") " pod="openshift-marketplace/redhat-marketplace-8tf7v"
Oct 03 14:14:19 crc kubenswrapper[4636]: I1003 14:14:19.533014 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9721832c-e1e9-49f9-96ab-253319b635bf-catalog-content\") pod \"redhat-marketplace-8tf7v\" (UID: \"9721832c-e1e9-49f9-96ab-253319b635bf\") " pod="openshift-marketplace/redhat-marketplace-8tf7v"
Oct 03 14:14:19 crc kubenswrapper[4636]: I1003 14:14:19.557360 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr9vx\" (UniqueName: \"kubernetes.io/projected/9721832c-e1e9-49f9-96ab-253319b635bf-kube-api-access-cr9vx\") pod \"redhat-marketplace-8tf7v\" (UID: \"9721832c-e1e9-49f9-96ab-253319b635bf\") " pod="openshift-marketplace/redhat-marketplace-8tf7v"
Oct 03 14:14:19 crc kubenswrapper[4636]: I1003 14:14:19.646781 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8tf7v"
Oct 03 14:14:19 crc kubenswrapper[4636]: I1003 14:14:19.940550 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djwt8" event={"ID":"fc4c313e-de02-4354-9aed-468571a0ff96","Type":"ContainerStarted","Data":"452749af1319208cc938e7121e83aae69fcd7be4b7ecefa77e7e0455ac228106"}
Oct 03 14:14:19 crc kubenswrapper[4636]: I1003 14:14:19.963455 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-djwt8" podStartSLOduration=3.452646617 podStartE2EDuration="6.963433298s" podCreationTimestamp="2025-10-03 14:14:13 +0000 UTC" firstStartedPulling="2025-10-03 14:14:15.876934905 +0000 UTC m=+805.735661162" lastFinishedPulling="2025-10-03 14:14:19.387721596 +0000 UTC m=+809.246447843" observedRunningTime="2025-10-03 14:14:19.960642477 +0000 UTC m=+809.819368734" watchObservedRunningTime="2025-10-03 14:14:19.963433298 +0000 UTC m=+809.822159545"
Oct 03 14:14:20 crc kubenswrapper[4636]: I1003 14:14:20.149352 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8tf7v"]
Oct 03 14:14:20 crc kubenswrapper[4636]: I1003 14:14:20.950741 4636 generic.go:334] "Generic (PLEG): container finished" podID="9721832c-e1e9-49f9-96ab-253319b635bf" containerID="4342c7ac942e7d228d413204f585d09b639d29552d5d0911063d791fade65a37" exitCode=0
Oct 03 14:14:20 crc kubenswrapper[4636]: I1003 14:14:20.950798 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8tf7v" event={"ID":"9721832c-e1e9-49f9-96ab-253319b635bf","Type":"ContainerDied","Data":"4342c7ac942e7d228d413204f585d09b639d29552d5d0911063d791fade65a37"}
Oct 03 14:14:20 crc kubenswrapper[4636]: I1003 14:14:20.951072 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8tf7v" event={"ID":"9721832c-e1e9-49f9-96ab-253319b635bf","Type":"ContainerStarted","Data":"56ceac41575bb84a54bdad554ce3d117068bf8c70c87146c0937e8b2e71a6b0c"}
Oct 03 14:14:23 crc kubenswrapper[4636]: I1003 14:14:23.846871 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-djwt8"
Oct 03 14:14:23 crc kubenswrapper[4636]: I1003 14:14:23.847247 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-djwt8"
Oct 03 14:14:23 crc kubenswrapper[4636]: I1003 14:14:23.898718 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-djwt8"
Oct 03 14:14:25 crc kubenswrapper[4636]: I1003 14:14:25.999067 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-ttj4x" event={"ID":"360e3dae-23f1-4ddd-9815-d6a41e611501","Type":"ContainerStarted","Data":"5147d789c4db33e282cabde09c267402c4142c42f77c2374948d4c3a93aa3325"}
Oct 03 14:14:25 crc kubenswrapper[4636]: I1003 14:14:25.999406 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-ttj4x"
Oct 03 14:14:26 crc kubenswrapper[4636]: I1003 14:14:26.001427 4636 generic.go:334] "Generic (PLEG): container finished" podID="27812b9a-f947-40fd-a74b-f10fa236e965" containerID="d915de8779fc68ebadeb9375267bbe54ecfba5f0e265c8eee3908a3b9fca50d0" exitCode=0
Oct 03 14:14:26 crc kubenswrapper[4636]: I1003 14:14:26.001453 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tzvhr" event={"ID":"27812b9a-f947-40fd-a74b-f10fa236e965","Type":"ContainerDied","Data":"d915de8779fc68ebadeb9375267bbe54ecfba5f0e265c8eee3908a3b9fca50d0"}
Oct 03 14:14:26 crc kubenswrapper[4636]: I1003 14:14:26.018976 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-ttj4x" podStartSLOduration=1.4152532899999999 podStartE2EDuration="12.01895862s" podCreationTimestamp="2025-10-03 14:14:14 +0000 UTC" firstStartedPulling="2025-10-03 14:14:15.118461486 +0000 UTC m=+804.977187733" lastFinishedPulling="2025-10-03 14:14:25.722166826 +0000 UTC m=+815.580893063" observedRunningTime="2025-10-03 14:14:26.017568884 +0000 UTC m=+815.876295131" watchObservedRunningTime="2025-10-03 14:14:26.01895862 +0000 UTC m=+815.877684877"
Oct 03 14:14:26 crc kubenswrapper[4636]: I1003 14:14:26.285245 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q2x6p"
Oct 03 14:14:26 crc kubenswrapper[4636]: I1003 14:14:26.351156 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q2x6p"
Oct 03 14:14:26 crc kubenswrapper[4636]: I1003 14:14:26.511625 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q2x6p"]
Oct 03 14:14:27 crc kubenswrapper[4636]: I1003 14:14:27.013243 4636 generic.go:334] "Generic (PLEG): container finished" podID="9721832c-e1e9-49f9-96ab-253319b635bf" containerID="488c91764770f931c20336d134b8370394821a0bcb44a48cf144fd4db4b35717" exitCode=0
Oct 03 14:14:27 crc kubenswrapper[4636]: I1003 14:14:27.013328 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8tf7v" event={"ID":"9721832c-e1e9-49f9-96ab-253319b635bf","Type":"ContainerDied","Data":"488c91764770f931c20336d134b8370394821a0bcb44a48cf144fd4db4b35717"}
Oct 03 14:14:27 crc kubenswrapper[4636]: I1003 14:14:27.016525 4636 generic.go:334] "Generic (PLEG): container finished" podID="27812b9a-f947-40fd-a74b-f10fa236e965" containerID="eb512de632ac1822b3ff88c9ed1008ff400267cddf1b63034d76370ad282cca3" exitCode=0
Oct 03 14:14:27 crc kubenswrapper[4636]: I1003 14:14:27.017729 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tzvhr" event={"ID":"27812b9a-f947-40fd-a74b-f10fa236e965","Type":"ContainerDied","Data":"eb512de632ac1822b3ff88c9ed1008ff400267cddf1b63034d76370ad282cca3"}
Oct 03 14:14:28 crc kubenswrapper[4636]: I1003 14:14:28.022388 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8tf7v" event={"ID":"9721832c-e1e9-49f9-96ab-253319b635bf","Type":"ContainerStarted","Data":"704cfb23e6cdf8838780118bff4c6ae75b59b17b3070c0163cc41c85d55fe721"}
Oct 03 14:14:28 crc kubenswrapper[4636]: I1003 14:14:28.026035 4636 generic.go:334] "Generic (PLEG): container finished" podID="27812b9a-f947-40fd-a74b-f10fa236e965" containerID="10bedf9a69dd7081c8a1054f4a3eed58a819c4b52415336d5962a6550e05ac46" exitCode=0
Oct 03 14:14:28 crc kubenswrapper[4636]: I1003 14:14:28.026115 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tzvhr" event={"ID":"27812b9a-f947-40fd-a74b-f10fa236e965","Type":"ContainerDied","Data":"10bedf9a69dd7081c8a1054f4a3eed58a819c4b52415336d5962a6550e05ac46"}
Oct 03 14:14:28 crc kubenswrapper[4636]: I1003 14:14:28.026432 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q2x6p" podUID="7dcc53f3-1d33-421f-adb3-2076f86658b1" containerName="registry-server" containerID="cri-o://8538e7468d01eee33c5663d5c9cfe17062ac6166bb700858128a8b36c4d169a1" gracePeriod=2
Oct 03 14:14:28 crc kubenswrapper[4636]: I1003 14:14:28.088267 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8tf7v" podStartSLOduration=7.181376509 podStartE2EDuration="9.088251725s" podCreationTimestamp="2025-10-03 14:14:19 +0000 UTC" firstStartedPulling="2025-10-03 14:14:25.637125369 +0000 UTC m=+815.495851616" lastFinishedPulling="2025-10-03 14:14:27.544000595 +0000 UTC m=+817.402726832" observedRunningTime="2025-10-03 14:14:28.062355405 +0000 UTC m=+817.921081652" watchObservedRunningTime="2025-10-03 14:14:28.088251725 +0000 UTC m=+817.946977972"
Oct 03 14:14:28 crc kubenswrapper[4636]: I1003 14:14:28.470588 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q2x6p"
Oct 03 14:14:28 crc kubenswrapper[4636]: I1003 14:14:28.650173 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dcc53f3-1d33-421f-adb3-2076f86658b1-utilities\") pod \"7dcc53f3-1d33-421f-adb3-2076f86658b1\" (UID: \"7dcc53f3-1d33-421f-adb3-2076f86658b1\") "
Oct 03 14:14:28 crc kubenswrapper[4636]: I1003 14:14:28.650221 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dcc53f3-1d33-421f-adb3-2076f86658b1-catalog-content\") pod \"7dcc53f3-1d33-421f-adb3-2076f86658b1\" (UID: \"7dcc53f3-1d33-421f-adb3-2076f86658b1\") "
Oct 03 14:14:28 crc kubenswrapper[4636]: I1003 14:14:28.650295 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwmrg\" (UniqueName: \"kubernetes.io/projected/7dcc53f3-1d33-421f-adb3-2076f86658b1-kube-api-access-rwmrg\") pod \"7dcc53f3-1d33-421f-adb3-2076f86658b1\" (UID: \"7dcc53f3-1d33-421f-adb3-2076f86658b1\") "
Oct 03 14:14:28 crc kubenswrapper[4636]: I1003 14:14:28.651577 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dcc53f3-1d33-421f-adb3-2076f86658b1-utilities" (OuterVolumeSpecName: "utilities") pod "7dcc53f3-1d33-421f-adb3-2076f86658b1" (UID: "7dcc53f3-1d33-421f-adb3-2076f86658b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:14:28 crc kubenswrapper[4636]: I1003 14:14:28.657229 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dcc53f3-1d33-421f-adb3-2076f86658b1-kube-api-access-rwmrg" (OuterVolumeSpecName: "kube-api-access-rwmrg") pod "7dcc53f3-1d33-421f-adb3-2076f86658b1" (UID: "7dcc53f3-1d33-421f-adb3-2076f86658b1"). InnerVolumeSpecName "kube-api-access-rwmrg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:14:28 crc kubenswrapper[4636]: I1003 14:14:28.737593 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dcc53f3-1d33-421f-adb3-2076f86658b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7dcc53f3-1d33-421f-adb3-2076f86658b1" (UID: "7dcc53f3-1d33-421f-adb3-2076f86658b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:14:28 crc kubenswrapper[4636]: I1003 14:14:28.751846 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dcc53f3-1d33-421f-adb3-2076f86658b1-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 14:14:28 crc kubenswrapper[4636]: I1003 14:14:28.751885 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dcc53f3-1d33-421f-adb3-2076f86658b1-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 14:14:28 crc kubenswrapper[4636]: I1003 14:14:28.751898 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwmrg\" (UniqueName: \"kubernetes.io/projected/7dcc53f3-1d33-421f-adb3-2076f86658b1-kube-api-access-rwmrg\") on node \"crc\" DevicePath \"\""
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.036987 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tzvhr" event={"ID":"27812b9a-f947-40fd-a74b-f10fa236e965","Type":"ContainerStarted","Data":"88b2eb7cf148c35dae18245e9b4cbd6d506d2bbc6f6a608c16c7d17ba52461e6"}
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.037029 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tzvhr" event={"ID":"27812b9a-f947-40fd-a74b-f10fa236e965","Type":"ContainerStarted","Data":"48f6f019168c3b8c4500c1ee7011b18122eb12a587dd4a81644e378386c82b78"}
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.037043 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tzvhr" event={"ID":"27812b9a-f947-40fd-a74b-f10fa236e965","Type":"ContainerStarted","Data":"35e8e2fda520213911e648ce14696921ad19c28ef6d6e5cc20079287e01296fd"}
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.037053 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tzvhr" event={"ID":"27812b9a-f947-40fd-a74b-f10fa236e965","Type":"ContainerStarted","Data":"e68a7c2a32b34e964d842fd03f91245779c297d66bb10e66b0c2dccad414a798"}
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.037064 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tzvhr" event={"ID":"27812b9a-f947-40fd-a74b-f10fa236e965","Type":"ContainerStarted","Data":"13403c9e2805c51a50e0df2b8673e0f45806ba3efc0e6441a48f5748f653d463"}
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.038472 4636 generic.go:334] "Generic (PLEG): container finished" podID="7dcc53f3-1d33-421f-adb3-2076f86658b1" containerID="8538e7468d01eee33c5663d5c9cfe17062ac6166bb700858128a8b36c4d169a1" exitCode=0
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.038534 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q2x6p"
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.038546 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2x6p" event={"ID":"7dcc53f3-1d33-421f-adb3-2076f86658b1","Type":"ContainerDied","Data":"8538e7468d01eee33c5663d5c9cfe17062ac6166bb700858128a8b36c4d169a1"}
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.038571 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2x6p" event={"ID":"7dcc53f3-1d33-421f-adb3-2076f86658b1","Type":"ContainerDied","Data":"1e0064e27fcbec70be80d500723738bc25ea7f7d8870d1a8a787500c778e151c"}
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.038588 4636 scope.go:117] "RemoveContainer" containerID="8538e7468d01eee33c5663d5c9cfe17062ac6166bb700858128a8b36c4d169a1"
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.060347 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q2x6p"]
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.061886 4636 scope.go:117] "RemoveContainer" containerID="4c87a3c94f594b3fcb8a904d3621cfe1e83402789e672de409d554ce9d76af72"
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.065111 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q2x6p"]
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.117871 4636 scope.go:117] "RemoveContainer" containerID="7ca443ed09708e006eda5cd02dab6a81bc3221114d7ae4ed5ecb366e755fb58f"
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.138709 4636 scope.go:117] "RemoveContainer" containerID="8538e7468d01eee33c5663d5c9cfe17062ac6166bb700858128a8b36c4d169a1"
Oct 03 14:14:29 crc kubenswrapper[4636]: E1003 14:14:29.139459 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8538e7468d01eee33c5663d5c9cfe17062ac6166bb700858128a8b36c4d169a1\": container with ID starting with 8538e7468d01eee33c5663d5c9cfe17062ac6166bb700858128a8b36c4d169a1 not found: ID does not exist" containerID="8538e7468d01eee33c5663d5c9cfe17062ac6166bb700858128a8b36c4d169a1"
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.139505 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8538e7468d01eee33c5663d5c9cfe17062ac6166bb700858128a8b36c4d169a1"} err="failed to get container status \"8538e7468d01eee33c5663d5c9cfe17062ac6166bb700858128a8b36c4d169a1\": rpc error: code = NotFound desc = could not find container \"8538e7468d01eee33c5663d5c9cfe17062ac6166bb700858128a8b36c4d169a1\": container with ID starting with 8538e7468d01eee33c5663d5c9cfe17062ac6166bb700858128a8b36c4d169a1 not found: ID does not exist"
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.139533 4636 scope.go:117] "RemoveContainer" containerID="4c87a3c94f594b3fcb8a904d3621cfe1e83402789e672de409d554ce9d76af72"
Oct 03 14:14:29 crc kubenswrapper[4636]: E1003 14:14:29.139899 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c87a3c94f594b3fcb8a904d3621cfe1e83402789e672de409d554ce9d76af72\": container with ID starting with 4c87a3c94f594b3fcb8a904d3621cfe1e83402789e672de409d554ce9d76af72 not found: ID does not exist" containerID="4c87a3c94f594b3fcb8a904d3621cfe1e83402789e672de409d554ce9d76af72"
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.139930 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c87a3c94f594b3fcb8a904d3621cfe1e83402789e672de409d554ce9d76af72"} err="failed to get container status \"4c87a3c94f594b3fcb8a904d3621cfe1e83402789e672de409d554ce9d76af72\": rpc error: code = NotFound desc = could not find container \"4c87a3c94f594b3fcb8a904d3621cfe1e83402789e672de409d554ce9d76af72\": container with ID starting with 4c87a3c94f594b3fcb8a904d3621cfe1e83402789e672de409d554ce9d76af72 not found: ID does not exist"
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.139974 4636 scope.go:117] "RemoveContainer" containerID="7ca443ed09708e006eda5cd02dab6a81bc3221114d7ae4ed5ecb366e755fb58f"
Oct 03 14:14:29 crc kubenswrapper[4636]: E1003 14:14:29.140317 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ca443ed09708e006eda5cd02dab6a81bc3221114d7ae4ed5ecb366e755fb58f\": container with ID starting with 7ca443ed09708e006eda5cd02dab6a81bc3221114d7ae4ed5ecb366e755fb58f not found: ID does not exist" containerID="7ca443ed09708e006eda5cd02dab6a81bc3221114d7ae4ed5ecb366e755fb58f"
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.140338 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ca443ed09708e006eda5cd02dab6a81bc3221114d7ae4ed5ecb366e755fb58f"} err="failed to get container status \"7ca443ed09708e006eda5cd02dab6a81bc3221114d7ae4ed5ecb366e755fb58f\": rpc error: code = NotFound desc = could not find container \"7ca443ed09708e006eda5cd02dab6a81bc3221114d7ae4ed5ecb366e755fb58f\": container with ID starting with 7ca443ed09708e006eda5cd02dab6a81bc3221114d7ae4ed5ecb366e755fb58f not found: ID does not exist"
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.647505 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8tf7v"
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.647542 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8tf7v"
Oct 03 14:14:29 crc kubenswrapper[4636]: I1003 14:14:29.717413 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8tf7v"
Oct 03 14:14:30 crc kubenswrapper[4636]: I1003 14:14:30.047527 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tzvhr" event={"ID":"27812b9a-f947-40fd-a74b-f10fa236e965","Type":"ContainerStarted","Data":"15740553babeb5ac4ba1df48af411e56de88dccd0ba8350bc22e67d21c5d7bb0"}
Oct 03 14:14:30 crc kubenswrapper[4636]: I1003 14:14:30.048506 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-tzvhr"
Oct 03 14:14:30 crc kubenswrapper[4636]: I1003 14:14:30.067601 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-tzvhr" podStartSLOduration=5.353132764 podStartE2EDuration="16.067587687s" podCreationTimestamp="2025-10-03 14:14:14 +0000 UTC" firstStartedPulling="2025-10-03 14:14:15.021758601 +0000 UTC m=+804.880484848" lastFinishedPulling="2025-10-03 14:14:25.736213524 +0000 UTC m=+815.594939771" observedRunningTime="2025-10-03 14:14:30.066554031 +0000 UTC m=+819.925280288" watchObservedRunningTime="2025-10-03 14:14:30.067587687 +0000 UTC m=+819.926313934"
Oct 03 14:14:30 crc kubenswrapper[4636]: I1003 14:14:30.802679 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dcc53f3-1d33-421f-adb3-2076f86658b1" path="/var/lib/kubelet/pods/7dcc53f3-1d33-421f-adb3-2076f86658b1/volumes"
Oct 03 14:14:33 crc kubenswrapper[4636]: I1003 14:14:33.885069 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-djwt8"
Oct 03 14:14:33 crc kubenswrapper[4636]: I1003 14:14:33.925292 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qtnbl"]
Oct 03 14:14:33 crc kubenswrapper[4636]: E1003 14:14:33.925600 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dcc53f3-1d33-421f-adb3-2076f86658b1" containerName="extract-utilities"
Oct 03 14:14:33 crc kubenswrapper[4636]: I1003 14:14:33.925622 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dcc53f3-1d33-421f-adb3-2076f86658b1" containerName="extract-utilities"
Oct 03 14:14:33 crc kubenswrapper[4636]: E1003 14:14:33.925632 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dcc53f3-1d33-421f-adb3-2076f86658b1" containerName="extract-content"
Oct 03 14:14:33 crc kubenswrapper[4636]: I1003 14:14:33.925640 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dcc53f3-1d33-421f-adb3-2076f86658b1" containerName="extract-content"
Oct 03 14:14:33 crc kubenswrapper[4636]: E1003 14:14:33.925648 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dcc53f3-1d33-421f-adb3-2076f86658b1" containerName="registry-server"
Oct 03 14:14:33 crc kubenswrapper[4636]: I1003 14:14:33.925655 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dcc53f3-1d33-421f-adb3-2076f86658b1" containerName="registry-server"
Oct 03 14:14:33 crc kubenswrapper[4636]: I1003 14:14:33.925790 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dcc53f3-1d33-421f-adb3-2076f86658b1" containerName="registry-server"
Oct 03 14:14:33 crc kubenswrapper[4636]: I1003 14:14:33.926736 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qtnbl"
Oct 03 14:14:33 crc kubenswrapper[4636]: I1003 14:14:33.938432 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qtnbl"]
Oct 03 14:14:34 crc kubenswrapper[4636]: I1003 14:14:34.019365 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfwss\" (UniqueName: \"kubernetes.io/projected/993e8814-c119-4771-91ac-c0cbc352cbdc-kube-api-access-bfwss\") pod \"certified-operators-qtnbl\" (UID: \"993e8814-c119-4771-91ac-c0cbc352cbdc\") " pod="openshift-marketplace/certified-operators-qtnbl"
Oct 03 14:14:34 crc kubenswrapper[4636]: I1003 14:14:34.019429 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/993e8814-c119-4771-91ac-c0cbc352cbdc-utilities\") pod \"certified-operators-qtnbl\" (UID: \"993e8814-c119-4771-91ac-c0cbc352cbdc\") " pod="openshift-marketplace/certified-operators-qtnbl"
Oct 03 14:14:34 crc kubenswrapper[4636]: I1003 14:14:34.019510 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/993e8814-c119-4771-91ac-c0cbc352cbdc-catalog-content\") pod \"certified-operators-qtnbl\" (UID: \"993e8814-c119-4771-91ac-c0cbc352cbdc\") " pod="openshift-marketplace/certified-operators-qtnbl"
Oct 03 14:14:34 crc kubenswrapper[4636]: I1003 14:14:34.120848 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/993e8814-c119-4771-91ac-c0cbc352cbdc-catalog-content\") pod \"certified-operators-qtnbl\" (UID: \"993e8814-c119-4771-91ac-c0cbc352cbdc\") " pod="openshift-marketplace/certified-operators-qtnbl"
Oct 03 14:14:34 crc kubenswrapper[4636]: I1003 14:14:34.120925 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfwss\" (UniqueName: \"kubernetes.io/projected/993e8814-c119-4771-91ac-c0cbc352cbdc-kube-api-access-bfwss\") pod \"certified-operators-qtnbl\" (UID: \"993e8814-c119-4771-91ac-c0cbc352cbdc\") " pod="openshift-marketplace/certified-operators-qtnbl"
Oct 03 14:14:34 crc kubenswrapper[4636]: I1003 14:14:34.120952 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/993e8814-c119-4771-91ac-c0cbc352cbdc-utilities\") pod \"certified-operators-qtnbl\" (UID: \"993e8814-c119-4771-91ac-c0cbc352cbdc\") " pod="openshift-marketplace/certified-operators-qtnbl"
Oct 03 14:14:34 crc kubenswrapper[4636]: I1003 14:14:34.121512 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/993e8814-c119-4771-91ac-c0cbc352cbdc-utilities\") pod \"certified-operators-qtnbl\" (UID: \"993e8814-c119-4771-91ac-c0cbc352cbdc\") " pod="openshift-marketplace/certified-operators-qtnbl"
Oct 03 14:14:34 crc kubenswrapper[4636]: I1003 14:14:34.121520 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/993e8814-c119-4771-91ac-c0cbc352cbdc-catalog-content\") pod \"certified-operators-qtnbl\" (UID: \"993e8814-c119-4771-91ac-c0cbc352cbdc\") " pod="openshift-marketplace/certified-operators-qtnbl"
Oct 03 14:14:34 crc kubenswrapper[4636]: I1003 14:14:34.155936 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfwss\" (UniqueName: \"kubernetes.io/projected/993e8814-c119-4771-91ac-c0cbc352cbdc-kube-api-access-bfwss\") pod \"certified-operators-qtnbl\" (UID: \"993e8814-c119-4771-91ac-c0cbc352cbdc\") " pod="openshift-marketplace/certified-operators-qtnbl"
Oct 03 14:14:34 crc kubenswrapper[4636]: I1003 14:14:34.243249 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qtnbl"
Oct 03 14:14:34 crc kubenswrapper[4636]: I1003 14:14:34.765485 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qtnbl"]
Oct 03 14:14:34 crc kubenswrapper[4636]: W1003 14:14:34.777317 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod993e8814_c119_4771_91ac_c0cbc352cbdc.slice/crio-8124a4791eda9e78f22292aa24c0701a9aef81fdd6cbdea7e2adfb72542b0f1c WatchSource:0}: Error finding container 8124a4791eda9e78f22292aa24c0701a9aef81fdd6cbdea7e2adfb72542b0f1c: Status 404 returned error can't find the container with id 8124a4791eda9e78f22292aa24c0701a9aef81fdd6cbdea7e2adfb72542b0f1c
Oct 03 14:14:34 crc kubenswrapper[4636]: I1003 14:14:34.881428 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-tzvhr"
Oct 03 14:14:34 crc kubenswrapper[4636]: I1003 14:14:34.918564 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-tzvhr"
Oct 03 14:14:35 crc kubenswrapper[4636]: I1003 14:14:35.081324 4636 generic.go:334] "Generic (PLEG): container finished" podID="993e8814-c119-4771-91ac-c0cbc352cbdc" containerID="437a223100c587baaf37c7a11a5a7cf27ed2b2c1ec86cfa20a3db9321f0d4dbd" exitCode=0
Oct 03 14:14:35 crc kubenswrapper[4636]: I1003 14:14:35.081380 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtnbl" event={"ID":"993e8814-c119-4771-91ac-c0cbc352cbdc","Type":"ContainerDied","Data":"437a223100c587baaf37c7a11a5a7cf27ed2b2c1ec86cfa20a3db9321f0d4dbd"}
Oct 03 14:14:35 crc kubenswrapper[4636]: I1003 14:14:35.082990 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtnbl" event={"ID":"993e8814-c119-4771-91ac-c0cbc352cbdc","Type":"ContainerStarted","Data":"8124a4791eda9e78f22292aa24c0701a9aef81fdd6cbdea7e2adfb72542b0f1c"}
Oct 03 14:14:35 crc kubenswrapper[4636]: I1003 14:14:35.651219 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-87w8j"
Oct 03 14:14:36 crc kubenswrapper[4636]: I1003 14:14:36.113544 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-djwt8"]
Oct 03 14:14:36 crc kubenswrapper[4636]: I1003 14:14:36.114438 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-djwt8" podUID="fc4c313e-de02-4354-9aed-468571a0ff96" containerName="registry-server" containerID="cri-o://452749af1319208cc938e7121e83aae69fcd7be4b7ecefa77e7e0455ac228106" gracePeriod=2
Oct 03 14:14:36 crc kubenswrapper[4636]: I1003 14:14:36.484247 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-ggz7j"
Oct 03 14:14:36 crc kubenswrapper[4636]: I1003 14:14:36.507943 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-djwt8"
Oct 03 14:14:36 crc kubenswrapper[4636]: I1003 14:14:36.663957 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc4c313e-de02-4354-9aed-468571a0ff96-catalog-content\") pod \"fc4c313e-de02-4354-9aed-468571a0ff96\" (UID: \"fc4c313e-de02-4354-9aed-468571a0ff96\") "
Oct 03 14:14:36 crc kubenswrapper[4636]: I1003 14:14:36.664052 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkjvw\" (UniqueName: \"kubernetes.io/projected/fc4c313e-de02-4354-9aed-468571a0ff96-kube-api-access-zkjvw\") pod \"fc4c313e-de02-4354-9aed-468571a0ff96\" (UID: \"fc4c313e-de02-4354-9aed-468571a0ff96\") "
Oct 03 14:14:36 crc kubenswrapper[4636]: I1003 14:14:36.664073 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc4c313e-de02-4354-9aed-468571a0ff96-utilities\") pod \"fc4c313e-de02-4354-9aed-468571a0ff96\" (UID: \"fc4c313e-de02-4354-9aed-468571a0ff96\") "
Oct 03 14:14:36 crc kubenswrapper[4636]: I1003 14:14:36.665070 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc4c313e-de02-4354-9aed-468571a0ff96-utilities" (OuterVolumeSpecName: "utilities") pod "fc4c313e-de02-4354-9aed-468571a0ff96" (UID: "fc4c313e-de02-4354-9aed-468571a0ff96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:14:36 crc kubenswrapper[4636]: I1003 14:14:36.673085 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc4c313e-de02-4354-9aed-468571a0ff96-kube-api-access-zkjvw" (OuterVolumeSpecName: "kube-api-access-zkjvw") pod "fc4c313e-de02-4354-9aed-468571a0ff96" (UID: "fc4c313e-de02-4354-9aed-468571a0ff96"). InnerVolumeSpecName "kube-api-access-zkjvw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:14:36 crc kubenswrapper[4636]: I1003 14:14:36.705565 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc4c313e-de02-4354-9aed-468571a0ff96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc4c313e-de02-4354-9aed-468571a0ff96" (UID: "fc4c313e-de02-4354-9aed-468571a0ff96"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:14:36 crc kubenswrapper[4636]: I1003 14:14:36.766306 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkjvw\" (UniqueName: \"kubernetes.io/projected/fc4c313e-de02-4354-9aed-468571a0ff96-kube-api-access-zkjvw\") on node \"crc\" DevicePath \"\""
Oct 03 14:14:36 crc kubenswrapper[4636]: I1003 14:14:36.766338 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc4c313e-de02-4354-9aed-468571a0ff96-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 14:14:36 crc kubenswrapper[4636]: I1003 14:14:36.766348 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc4c313e-de02-4354-9aed-468571a0ff96-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 14:14:37 crc kubenswrapper[4636]: I1003 14:14:37.095625 4636 generic.go:334] "Generic (PLEG): container finished" podID="993e8814-c119-4771-91ac-c0cbc352cbdc" containerID="f897dac717c6aaf5e3d23a9bea0c5d193b4c520bbb5f54db30e409e3cab348ae" exitCode=0
Oct 03 14:14:37 crc kubenswrapper[4636]: I1003 14:14:37.095776 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtnbl" event={"ID":"993e8814-c119-4771-91ac-c0cbc352cbdc","Type":"ContainerDied","Data":"f897dac717c6aaf5e3d23a9bea0c5d193b4c520bbb5f54db30e409e3cab348ae"}
Oct 03 14:14:37 crc kubenswrapper[4636]: I1003 14:14:37.102447 4636 generic.go:334] "Generic (PLEG): container finished" podID="fc4c313e-de02-4354-9aed-468571a0ff96" containerID="452749af1319208cc938e7121e83aae69fcd7be4b7ecefa77e7e0455ac228106" exitCode=0
Oct 03 14:14:37 crc kubenswrapper[4636]: I1003 14:14:37.102591 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djwt8" event={"ID":"fc4c313e-de02-4354-9aed-468571a0ff96","Type":"ContainerDied","Data":"452749af1319208cc938e7121e83aae69fcd7be4b7ecefa77e7e0455ac228106"}
Oct 03 14:14:37 crc kubenswrapper[4636]: I1003 14:14:37.102719 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djwt8" event={"ID":"fc4c313e-de02-4354-9aed-468571a0ff96","Type":"ContainerDied","Data":"9e6b30c1d87be14435a00cbbb119d31bdcc2c7ac9f1c96a787bac77da8a1e26b"}
Oct 03 14:14:37 crc kubenswrapper[4636]: I1003 14:14:37.102830 4636 scope.go:117] "RemoveContainer" containerID="452749af1319208cc938e7121e83aae69fcd7be4b7ecefa77e7e0455ac228106"
Oct 03 14:14:37 crc kubenswrapper[4636]: I1003 14:14:37.103285 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-djwt8"
Oct 03 14:14:37 crc kubenswrapper[4636]: I1003 14:14:37.137852 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-djwt8"]
Oct 03 14:14:37 crc kubenswrapper[4636]: I1003 14:14:37.140372 4636 scope.go:117] "RemoveContainer" containerID="869c0ee6fdd29c62472d632e4d27bb7da99108afd6144eea266abcf72b749b41"
Oct 03 14:14:37 crc kubenswrapper[4636]: I1003 14:14:37.144132 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-djwt8"]
Oct 03 14:14:37 crc kubenswrapper[4636]: I1003 14:14:37.158002 4636 scope.go:117] "RemoveContainer" containerID="ff9af1b6b880c56b287eeba12b296c04d0a53f3459de4c8bcee49d943085c5db"
Oct 03 14:14:37 crc kubenswrapper[4636]: I1003 14:14:37.179331 4636 scope.go:117] "RemoveContainer" containerID="452749af1319208cc938e7121e83aae69fcd7be4b7ecefa77e7e0455ac228106"
Oct 03 14:14:37 crc kubenswrapper[4636]: E1003 14:14:37.179933 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"452749af1319208cc938e7121e83aae69fcd7be4b7ecefa77e7e0455ac228106\": container with ID starting with 452749af1319208cc938e7121e83aae69fcd7be4b7ecefa77e7e0455ac228106 not found: ID does not exist" containerID="452749af1319208cc938e7121e83aae69fcd7be4b7ecefa77e7e0455ac228106"
Oct 03 14:14:37 crc kubenswrapper[4636]: I1003 14:14:37.179990 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"452749af1319208cc938e7121e83aae69fcd7be4b7ecefa77e7e0455ac228106"} err="failed to get container status \"452749af1319208cc938e7121e83aae69fcd7be4b7ecefa77e7e0455ac228106\": rpc error: code = NotFound desc = could not find container \"452749af1319208cc938e7121e83aae69fcd7be4b7ecefa77e7e0455ac228106\": container with ID starting with 452749af1319208cc938e7121e83aae69fcd7be4b7ecefa77e7e0455ac228106 not found: ID does not exist"
Oct 03 14:14:37 crc kubenswrapper[4636]: I1003 14:14:37.180025 4636 scope.go:117] "RemoveContainer" containerID="869c0ee6fdd29c62472d632e4d27bb7da99108afd6144eea266abcf72b749b41"
Oct 03 14:14:37 crc kubenswrapper[4636]: E1003 14:14:37.180394 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"869c0ee6fdd29c62472d632e4d27bb7da99108afd6144eea266abcf72b749b41\": container with ID starting with 869c0ee6fdd29c62472d632e4d27bb7da99108afd6144eea266abcf72b749b41 not found: ID does not exist" containerID="869c0ee6fdd29c62472d632e4d27bb7da99108afd6144eea266abcf72b749b41"
Oct 03 14:14:37 crc kubenswrapper[4636]: I1003 14:14:37.180428 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"869c0ee6fdd29c62472d632e4d27bb7da99108afd6144eea266abcf72b749b41"} err="failed to get container status \"869c0ee6fdd29c62472d632e4d27bb7da99108afd6144eea266abcf72b749b41\": rpc error: code = NotFound desc = could not find container \"869c0ee6fdd29c62472d632e4d27bb7da99108afd6144eea266abcf72b749b41\": container with ID starting with 869c0ee6fdd29c62472d632e4d27bb7da99108afd6144eea266abcf72b749b41 not found: ID does not exist"
Oct 03 14:14:37 crc kubenswrapper[4636]: I1003 14:14:37.180452 4636 scope.go:117] "RemoveContainer" containerID="ff9af1b6b880c56b287eeba12b296c04d0a53f3459de4c8bcee49d943085c5db"
Oct 03 14:14:37 crc kubenswrapper[4636]: E1003 14:14:37.180715 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff9af1b6b880c56b287eeba12b296c04d0a53f3459de4c8bcee49d943085c5db\": container with ID starting with ff9af1b6b880c56b287eeba12b296c04d0a53f3459de4c8bcee49d943085c5db not found: ID does not exist" containerID="ff9af1b6b880c56b287eeba12b296c04d0a53f3459de4c8bcee49d943085c5db"
Oct 03 14:14:37 crc kubenswrapper[4636]: I1003 14:14:37.180744 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff9af1b6b880c56b287eeba12b296c04d0a53f3459de4c8bcee49d943085c5db"} err="failed to get container status \"ff9af1b6b880c56b287eeba12b296c04d0a53f3459de4c8bcee49d943085c5db\": rpc error: code = NotFound desc = could not find container \"ff9af1b6b880c56b287eeba12b296c04d0a53f3459de4c8bcee49d943085c5db\": container with ID starting with ff9af1b6b880c56b287eeba12b296c04d0a53f3459de4c8bcee49d943085c5db not found: ID does not exist"
Oct 03 14:14:38 crc kubenswrapper[4636]: I1003 14:14:38.111084 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtnbl" event={"ID":"993e8814-c119-4771-91ac-c0cbc352cbdc","Type":"ContainerStarted","Data":"a3769deae7c19c70dbfdd05b42ff9f1236f1bf8c2591fca6e21857fb7014a9bf"}
Oct 03 14:14:38 crc kubenswrapper[4636]: I1003 14:14:38.134685 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qtnbl" podStartSLOduration=2.690214196 podStartE2EDuration="5.134664381s" podCreationTimestamp="2025-10-03 14:14:33 +0000 UTC" firstStartedPulling="2025-10-03 14:14:35.084396077 +0000 UTC m=+824.943122324" lastFinishedPulling="2025-10-03 14:14:37.528846272 +0000 UTC m=+827.387572509" observedRunningTime="2025-10-03 14:14:38.130320261 +0000 UTC m=+827.989046508" watchObservedRunningTime="2025-10-03 14:14:38.134664381 +0000 UTC m=+827.993390628"
Oct 03 14:14:38 crc kubenswrapper[4636]: I1003 14:14:38.803498 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc4c313e-de02-4354-9aed-468571a0ff96" path="/var/lib/kubelet/pods/fc4c313e-de02-4354-9aed-468571a0ff96/volumes"
Oct 03 14:14:39 crc kubenswrapper[4636]: I1003 14:14:39.690005 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8tf7v"
Oct 03 14:14:44 crc kubenswrapper[4636]: I1003 14:14:44.243780 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qtnbl"
Oct 03 14:14:44 crc kubenswrapper[4636]: I1003 14:14:44.244387 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qtnbl"
Oct 03 14:14:44 crc kubenswrapper[4636]: I1003 14:14:44.281144 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qtnbl"
Oct 03 14:14:44 crc kubenswrapper[4636]: I1003 14:14:44.320987 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-bsvgc"]
Oct 03 14:14:44 crc kubenswrapper[4636]: E1003 14:14:44.321524 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc4c313e-de02-4354-9aed-468571a0ff96" containerName="extract-utilities"
Oct 03 14:14:44 crc kubenswrapper[4636]: I1003 14:14:44.321598 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc4c313e-de02-4354-9aed-468571a0ff96" containerName="extract-utilities"
Oct 03 14:14:44 crc kubenswrapper[4636]: E1003 14:14:44.321663 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc4c313e-de02-4354-9aed-468571a0ff96" containerName="registry-server"
Oct 03 14:14:44 crc kubenswrapper[4636]: I1003 14:14:44.321715 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc4c313e-de02-4354-9aed-468571a0ff96" containerName="registry-server"
Oct 03 14:14:44 crc kubenswrapper[4636]: E1003 14:14:44.321799 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc4c313e-de02-4354-9aed-468571a0ff96" containerName="extract-content"
Oct 03 14:14:44 crc kubenswrapper[4636]: I1003 14:14:44.321869 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc4c313e-de02-4354-9aed-468571a0ff96" containerName="extract-content"
Oct 03 14:14:44 crc kubenswrapper[4636]: I1003 14:14:44.322035 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc4c313e-de02-4354-9aed-468571a0ff96" containerName="registry-server"
Oct 03 14:14:44 crc kubenswrapper[4636]: I1003 14:14:44.322491 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bsvgc"
Oct 03 14:14:44 crc kubenswrapper[4636]: I1003 14:14:44.324564 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-m8kpx"
Oct 03 14:14:44 crc kubenswrapper[4636]: I1003 14:14:44.328935 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Oct 03 14:14:44 crc kubenswrapper[4636]: I1003 14:14:44.335667 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Oct 03 14:14:44 crc kubenswrapper[4636]: I1003 14:14:44.337556 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-bsvgc"]
Oct 03 14:14:44 crc kubenswrapper[4636]: I1003 14:14:44.464243 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gxf6\" (UniqueName: \"kubernetes.io/projected/8721857c-625f-4884-bb46-55f9ce071491-kube-api-access-6gxf6\") pod \"openstack-operator-index-bsvgc\" (UID: \"8721857c-625f-4884-bb46-55f9ce071491\") " pod="openstack-operators/openstack-operator-index-bsvgc"
Oct 03 14:14:44 crc kubenswrapper[4636]: I1003 14:14:44.566269 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gxf6\" (UniqueName: \"kubernetes.io/projected/8721857c-625f-4884-bb46-55f9ce071491-kube-api-access-6gxf6\") pod \"openstack-operator-index-bsvgc\" (UID: \"8721857c-625f-4884-bb46-55f9ce071491\") " pod="openstack-operators/openstack-operator-index-bsvgc"
Oct 03 14:14:44 crc kubenswrapper[4636]: I1003 14:14:44.583930 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gxf6\" (UniqueName: \"kubernetes.io/projected/8721857c-625f-4884-bb46-55f9ce071491-kube-api-access-6gxf6\") pod \"openstack-operator-index-bsvgc\" (UID: \"8721857c-625f-4884-bb46-55f9ce071491\") " pod="openstack-operators/openstack-operator-index-bsvgc"
Oct 03 14:14:44 crc kubenswrapper[4636]: I1003 14:14:44.646370 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bsvgc"
Oct 03 14:14:44 crc kubenswrapper[4636]: I1003 14:14:44.882627 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-tzvhr"
Oct 03 14:14:44 crc kubenswrapper[4636]: I1003 14:14:44.910034 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-ttj4x"
Oct 03 14:14:45 crc kubenswrapper[4636]: I1003 14:14:45.043509 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-bsvgc"]
Oct 03 14:14:45 crc kubenswrapper[4636]: I1003 14:14:45.148736 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bsvgc" event={"ID":"8721857c-625f-4884-bb46-55f9ce071491","Type":"ContainerStarted","Data":"f78b37813cd5e56b6c91ee11bd5574d9fe4964dc68299eddfbe607cb434d4556"}
Oct 03 14:14:45 crc kubenswrapper[4636]: I1003 14:14:45.196072 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qtnbl"
Oct 03 14:14:45 crc kubenswrapper[4636]: I1003 14:14:45.314824 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8tf7v"]
Oct 03 14:14:45 crc kubenswrapper[4636]: I1003 14:14:45.315061 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8tf7v" podUID="9721832c-e1e9-49f9-96ab-253319b635bf" containerName="registry-server" containerID="cri-o://704cfb23e6cdf8838780118bff4c6ae75b59b17b3070c0163cc41c85d55fe721" gracePeriod=2
Oct 03 14:14:45 crc kubenswrapper[4636]: I1003 14:14:45.747745 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8tf7v"
Oct 03 14:14:45 crc kubenswrapper[4636]: I1003 14:14:45.887313 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9721832c-e1e9-49f9-96ab-253319b635bf-catalog-content\") pod \"9721832c-e1e9-49f9-96ab-253319b635bf\" (UID: \"9721832c-e1e9-49f9-96ab-253319b635bf\") "
Oct 03 14:14:45 crc kubenswrapper[4636]: I1003 14:14:45.887390 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr9vx\" (UniqueName: \"kubernetes.io/projected/9721832c-e1e9-49f9-96ab-253319b635bf-kube-api-access-cr9vx\") pod \"9721832c-e1e9-49f9-96ab-253319b635bf\" (UID: \"9721832c-e1e9-49f9-96ab-253319b635bf\") "
Oct 03 14:14:45 crc kubenswrapper[4636]: I1003 14:14:45.887657 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9721832c-e1e9-49f9-96ab-253319b635bf-utilities\") pod \"9721832c-e1e9-49f9-96ab-253319b635bf\" (UID: \"9721832c-e1e9-49f9-96ab-253319b635bf\") "
Oct 03 14:14:45 crc kubenswrapper[4636]: I1003 14:14:45.888507 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9721832c-e1e9-49f9-96ab-253319b635bf-utilities" (OuterVolumeSpecName: "utilities") pod "9721832c-e1e9-49f9-96ab-253319b635bf" (UID: "9721832c-e1e9-49f9-96ab-253319b635bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:14:45 crc kubenswrapper[4636]: I1003 14:14:45.894551 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9721832c-e1e9-49f9-96ab-253319b635bf-kube-api-access-cr9vx" (OuterVolumeSpecName: "kube-api-access-cr9vx") pod "9721832c-e1e9-49f9-96ab-253319b635bf" (UID: "9721832c-e1e9-49f9-96ab-253319b635bf"). InnerVolumeSpecName "kube-api-access-cr9vx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:14:45 crc kubenswrapper[4636]: I1003 14:14:45.917185 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9721832c-e1e9-49f9-96ab-253319b635bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9721832c-e1e9-49f9-96ab-253319b635bf" (UID: "9721832c-e1e9-49f9-96ab-253319b635bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:14:45 crc kubenswrapper[4636]: I1003 14:14:45.989700 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9721832c-e1e9-49f9-96ab-253319b635bf-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 14:14:45 crc kubenswrapper[4636]: I1003 14:14:45.989740 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9721832c-e1e9-49f9-96ab-253319b635bf-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 14:14:45 crc kubenswrapper[4636]: I1003 14:14:45.989778 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr9vx\" (UniqueName: \"kubernetes.io/projected/9721832c-e1e9-49f9-96ab-253319b635bf-kube-api-access-cr9vx\") on node \"crc\" DevicePath \"\""
Oct 03 14:14:46 crc kubenswrapper[4636]: I1003 14:14:46.158735 4636 generic.go:334] "Generic (PLEG): container finished" podID="9721832c-e1e9-49f9-96ab-253319b635bf" containerID="704cfb23e6cdf8838780118bff4c6ae75b59b17b3070c0163cc41c85d55fe721" exitCode=0
Oct 03 14:14:46 crc kubenswrapper[4636]: I1003 14:14:46.158901 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8tf7v" event={"ID":"9721832c-e1e9-49f9-96ab-253319b635bf","Type":"ContainerDied","Data":"704cfb23e6cdf8838780118bff4c6ae75b59b17b3070c0163cc41c85d55fe721"}
Oct 03 14:14:46 crc kubenswrapper[4636]: I1003 14:14:46.158916 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8tf7v"
Oct 03 14:14:46 crc kubenswrapper[4636]: I1003 14:14:46.159167 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8tf7v" event={"ID":"9721832c-e1e9-49f9-96ab-253319b635bf","Type":"ContainerDied","Data":"56ceac41575bb84a54bdad554ce3d117068bf8c70c87146c0937e8b2e71a6b0c"}
Oct 03 14:14:46 crc kubenswrapper[4636]: I1003 14:14:46.159198 4636 scope.go:117] "RemoveContainer" containerID="704cfb23e6cdf8838780118bff4c6ae75b59b17b3070c0163cc41c85d55fe721"
Oct 03 14:14:46 crc kubenswrapper[4636]: I1003 14:14:46.180663 4636 scope.go:117] "RemoveContainer" containerID="488c91764770f931c20336d134b8370394821a0bcb44a48cf144fd4db4b35717"
Oct 03 14:14:46 crc kubenswrapper[4636]: I1003 14:14:46.189978 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8tf7v"]
Oct 03 14:14:46 crc kubenswrapper[4636]: I1003 14:14:46.190971 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8tf7v"]
Oct 03 14:14:46 crc kubenswrapper[4636]: I1003 14:14:46.208135 4636 scope.go:117] "RemoveContainer" containerID="4342c7ac942e7d228d413204f585d09b639d29552d5d0911063d791fade65a37"
Oct 03 14:14:46 crc kubenswrapper[4636]: I1003 14:14:46.227209 4636 scope.go:117] "RemoveContainer" containerID="704cfb23e6cdf8838780118bff4c6ae75b59b17b3070c0163cc41c85d55fe721"
Oct 03 14:14:46 crc kubenswrapper[4636]: E1003 14:14:46.227625 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"704cfb23e6cdf8838780118bff4c6ae75b59b17b3070c0163cc41c85d55fe721\": container with ID starting with 704cfb23e6cdf8838780118bff4c6ae75b59b17b3070c0163cc41c85d55fe721 not found: ID does not exist" containerID="704cfb23e6cdf8838780118bff4c6ae75b59b17b3070c0163cc41c85d55fe721"
Oct 03 14:14:46 crc kubenswrapper[4636]: I1003 14:14:46.227713 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"704cfb23e6cdf8838780118bff4c6ae75b59b17b3070c0163cc41c85d55fe721"} err="failed to get container status \"704cfb23e6cdf8838780118bff4c6ae75b59b17b3070c0163cc41c85d55fe721\": rpc error: code = NotFound desc = could not find container \"704cfb23e6cdf8838780118bff4c6ae75b59b17b3070c0163cc41c85d55fe721\": container with ID starting with 704cfb23e6cdf8838780118bff4c6ae75b59b17b3070c0163cc41c85d55fe721 not found: ID does not exist"
Oct 03 14:14:46 crc kubenswrapper[4636]: I1003 14:14:46.227739 4636 scope.go:117] "RemoveContainer" containerID="488c91764770f931c20336d134b8370394821a0bcb44a48cf144fd4db4b35717"
Oct 03 14:14:46 crc kubenswrapper[4636]: E1003 14:14:46.231284 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"488c91764770f931c20336d134b8370394821a0bcb44a48cf144fd4db4b35717\": container with ID starting with 488c91764770f931c20336d134b8370394821a0bcb44a48cf144fd4db4b35717 not found: ID does not exist" containerID="488c91764770f931c20336d134b8370394821a0bcb44a48cf144fd4db4b35717"
Oct 03 14:14:46 crc kubenswrapper[4636]: I1003 14:14:46.231319 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"488c91764770f931c20336d134b8370394821a0bcb44a48cf144fd4db4b35717"} err="failed to get container status \"488c91764770f931c20336d134b8370394821a0bcb44a48cf144fd4db4b35717\": rpc error: code = NotFound desc = could not find container \"488c91764770f931c20336d134b8370394821a0bcb44a48cf144fd4db4b35717\": container with ID starting with 488c91764770f931c20336d134b8370394821a0bcb44a48cf144fd4db4b35717 not found: ID does not exist"
Oct 03 14:14:46 crc kubenswrapper[4636]: I1003 14:14:46.231343 4636 scope.go:117] "RemoveContainer" containerID="4342c7ac942e7d228d413204f585d09b639d29552d5d0911063d791fade65a37"
Oct 03 14:14:46 crc kubenswrapper[4636]: E1003 14:14:46.231755 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4342c7ac942e7d228d413204f585d09b639d29552d5d0911063d791fade65a37\": container with ID starting with 4342c7ac942e7d228d413204f585d09b639d29552d5d0911063d791fade65a37 not found: ID does not exist" containerID="4342c7ac942e7d228d413204f585d09b639d29552d5d0911063d791fade65a37"
Oct 03 14:14:46 crc kubenswrapper[4636]: I1003 14:14:46.231780 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4342c7ac942e7d228d413204f585d09b639d29552d5d0911063d791fade65a37"} err="failed to get container status \"4342c7ac942e7d228d413204f585d09b639d29552d5d0911063d791fade65a37\": rpc error: code = NotFound desc = could not find container \"4342c7ac942e7d228d413204f585d09b639d29552d5d0911063d791fade65a37\": container with ID starting with 4342c7ac942e7d228d413204f585d09b639d29552d5d0911063d791fade65a37 not found: ID does not exist"
Oct 03 14:14:46 crc kubenswrapper[4636]: I1003 14:14:46.802427 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9721832c-e1e9-49f9-96ab-253319b635bf" path="/var/lib/kubelet/pods/9721832c-e1e9-49f9-96ab-253319b635bf/volumes"
Oct 03 14:14:49 crc kubenswrapper[4636]: I1003 14:14:49.112391 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qtnbl"]
Oct 03 14:14:49 crc kubenswrapper[4636]: I1003 14:14:49.112850 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qtnbl" podUID="993e8814-c119-4771-91ac-c0cbc352cbdc" containerName="registry-server" containerID="cri-o://a3769deae7c19c70dbfdd05b42ff9f1236f1bf8c2591fca6e21857fb7014a9bf" gracePeriod=2
Oct 03 14:14:49 crc kubenswrapper[4636]: I1003 14:14:49.178484 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bsvgc" event={"ID":"8721857c-625f-4884-bb46-55f9ce071491","Type":"ContainerStarted","Data":"0ffb445ea89c1e527cceb37b63478e4ad64f8ce631bdbf2d9a6f71c3d2cbfea1"}
Oct 03 14:14:49 crc kubenswrapper[4636]: I1003 14:14:49.200655 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-bsvgc" podStartSLOduration=1.392556454 podStartE2EDuration="5.200521398s" podCreationTimestamp="2025-10-03 14:14:44 +0000 UTC" firstStartedPulling="2025-10-03 14:14:45.054580611 +0000 UTC m=+834.913306858" lastFinishedPulling="2025-10-03 14:14:48.862545555 +0000 UTC m=+838.721271802" observedRunningTime="2025-10-03 14:14:49.198205409 +0000 UTC m=+839.056931656" watchObservedRunningTime="2025-10-03 14:14:49.200521398 +0000 UTC m=+839.059247635"
Oct 03 14:14:49 crc kubenswrapper[4636]: I1003 14:14:49.590595 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qtnbl"
Oct 03 14:14:49 crc kubenswrapper[4636]: I1003 14:14:49.639593 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfwss\" (UniqueName: \"kubernetes.io/projected/993e8814-c119-4771-91ac-c0cbc352cbdc-kube-api-access-bfwss\") pod \"993e8814-c119-4771-91ac-c0cbc352cbdc\" (UID: \"993e8814-c119-4771-91ac-c0cbc352cbdc\") "
Oct 03 14:14:49 crc kubenswrapper[4636]: I1003 14:14:49.639644 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/993e8814-c119-4771-91ac-c0cbc352cbdc-utilities\") pod \"993e8814-c119-4771-91ac-c0cbc352cbdc\" (UID: \"993e8814-c119-4771-91ac-c0cbc352cbdc\") "
Oct 03 14:14:49 crc kubenswrapper[4636]: I1003 14:14:49.639673 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/993e8814-c119-4771-91ac-c0cbc352cbdc-catalog-content\") pod \"993e8814-c119-4771-91ac-c0cbc352cbdc\" (UID: \"993e8814-c119-4771-91ac-c0cbc352cbdc\") "
Oct 03 14:14:49 crc kubenswrapper[4636]: I1003 14:14:49.640598 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/993e8814-c119-4771-91ac-c0cbc352cbdc-utilities" (OuterVolumeSpecName: "utilities") pod "993e8814-c119-4771-91ac-c0cbc352cbdc" (UID: "993e8814-c119-4771-91ac-c0cbc352cbdc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:14:49 crc kubenswrapper[4636]: I1003 14:14:49.674565 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/993e8814-c119-4771-91ac-c0cbc352cbdc-kube-api-access-bfwss" (OuterVolumeSpecName: "kube-api-access-bfwss") pod "993e8814-c119-4771-91ac-c0cbc352cbdc" (UID: "993e8814-c119-4771-91ac-c0cbc352cbdc"). InnerVolumeSpecName "kube-api-access-bfwss". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:14:49 crc kubenswrapper[4636]: I1003 14:14:49.686620 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/993e8814-c119-4771-91ac-c0cbc352cbdc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "993e8814-c119-4771-91ac-c0cbc352cbdc" (UID: "993e8814-c119-4771-91ac-c0cbc352cbdc"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:14:49 crc kubenswrapper[4636]: I1003 14:14:49.740661 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/993e8814-c119-4771-91ac-c0cbc352cbdc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:14:49 crc kubenswrapper[4636]: I1003 14:14:49.740709 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfwss\" (UniqueName: \"kubernetes.io/projected/993e8814-c119-4771-91ac-c0cbc352cbdc-kube-api-access-bfwss\") on node \"crc\" DevicePath \"\"" Oct 03 14:14:49 crc kubenswrapper[4636]: I1003 14:14:49.740723 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/993e8814-c119-4771-91ac-c0cbc352cbdc-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:14:50 crc kubenswrapper[4636]: I1003 14:14:50.185360 4636 generic.go:334] "Generic (PLEG): container finished" podID="993e8814-c119-4771-91ac-c0cbc352cbdc" containerID="a3769deae7c19c70dbfdd05b42ff9f1236f1bf8c2591fca6e21857fb7014a9bf" exitCode=0 Oct 03 14:14:50 crc kubenswrapper[4636]: I1003 14:14:50.185415 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qtnbl" Oct 03 14:14:50 crc kubenswrapper[4636]: I1003 14:14:50.185452 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtnbl" event={"ID":"993e8814-c119-4771-91ac-c0cbc352cbdc","Type":"ContainerDied","Data":"a3769deae7c19c70dbfdd05b42ff9f1236f1bf8c2591fca6e21857fb7014a9bf"} Oct 03 14:14:50 crc kubenswrapper[4636]: I1003 14:14:50.185491 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtnbl" event={"ID":"993e8814-c119-4771-91ac-c0cbc352cbdc","Type":"ContainerDied","Data":"8124a4791eda9e78f22292aa24c0701a9aef81fdd6cbdea7e2adfb72542b0f1c"} Oct 03 14:14:50 crc kubenswrapper[4636]: I1003 14:14:50.185534 4636 scope.go:117] "RemoveContainer" containerID="a3769deae7c19c70dbfdd05b42ff9f1236f1bf8c2591fca6e21857fb7014a9bf" Oct 03 14:14:50 crc kubenswrapper[4636]: I1003 14:14:50.203757 4636 scope.go:117] "RemoveContainer" containerID="f897dac717c6aaf5e3d23a9bea0c5d193b4c520bbb5f54db30e409e3cab348ae" Oct 03 14:14:50 crc kubenswrapper[4636]: I1003 14:14:50.226480 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qtnbl"] Oct 03 14:14:50 crc kubenswrapper[4636]: I1003 14:14:50.226839 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qtnbl"] Oct 03 14:14:50 crc kubenswrapper[4636]: I1003 14:14:50.232545 4636 scope.go:117] "RemoveContainer" containerID="437a223100c587baaf37c7a11a5a7cf27ed2b2c1ec86cfa20a3db9321f0d4dbd" Oct 03 14:14:50 crc kubenswrapper[4636]: I1003 14:14:50.258876 4636 scope.go:117] "RemoveContainer" containerID="a3769deae7c19c70dbfdd05b42ff9f1236f1bf8c2591fca6e21857fb7014a9bf" Oct 03 14:14:50 crc kubenswrapper[4636]: E1003 14:14:50.259906 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3769deae7c19c70dbfdd05b42ff9f1236f1bf8c2591fca6e21857fb7014a9bf\": container with ID starting with a3769deae7c19c70dbfdd05b42ff9f1236f1bf8c2591fca6e21857fb7014a9bf not found: ID does not exist" containerID="a3769deae7c19c70dbfdd05b42ff9f1236f1bf8c2591fca6e21857fb7014a9bf" Oct 03 14:14:50 crc kubenswrapper[4636]: I1003 14:14:50.259933 
4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3769deae7c19c70dbfdd05b42ff9f1236f1bf8c2591fca6e21857fb7014a9bf"} err="failed to get container status \"a3769deae7c19c70dbfdd05b42ff9f1236f1bf8c2591fca6e21857fb7014a9bf\": rpc error: code = NotFound desc = could not find container \"a3769deae7c19c70dbfdd05b42ff9f1236f1bf8c2591fca6e21857fb7014a9bf\": container with ID starting with a3769deae7c19c70dbfdd05b42ff9f1236f1bf8c2591fca6e21857fb7014a9bf not found: ID does not exist" Oct 03 14:14:50 crc kubenswrapper[4636]: I1003 14:14:50.259955 4636 scope.go:117] "RemoveContainer" containerID="f897dac717c6aaf5e3d23a9bea0c5d193b4c520bbb5f54db30e409e3cab348ae" Oct 03 14:14:50 crc kubenswrapper[4636]: E1003 14:14:50.261082 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f897dac717c6aaf5e3d23a9bea0c5d193b4c520bbb5f54db30e409e3cab348ae\": container with ID starting with f897dac717c6aaf5e3d23a9bea0c5d193b4c520bbb5f54db30e409e3cab348ae not found: ID does not exist" containerID="f897dac717c6aaf5e3d23a9bea0c5d193b4c520bbb5f54db30e409e3cab348ae" Oct 03 14:14:50 crc kubenswrapper[4636]: I1003 14:14:50.261126 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f897dac717c6aaf5e3d23a9bea0c5d193b4c520bbb5f54db30e409e3cab348ae"} err="failed to get container status \"f897dac717c6aaf5e3d23a9bea0c5d193b4c520bbb5f54db30e409e3cab348ae\": rpc error: code = NotFound desc = could not find container \"f897dac717c6aaf5e3d23a9bea0c5d193b4c520bbb5f54db30e409e3cab348ae\": container with ID starting with f897dac717c6aaf5e3d23a9bea0c5d193b4c520bbb5f54db30e409e3cab348ae not found: ID does not exist" Oct 03 14:14:50 crc kubenswrapper[4636]: I1003 14:14:50.261140 4636 scope.go:117] "RemoveContainer" containerID="437a223100c587baaf37c7a11a5a7cf27ed2b2c1ec86cfa20a3db9321f0d4dbd" Oct 03 14:14:50 crc kubenswrapper[4636]: E1003 14:14:50.261961 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"437a223100c587baaf37c7a11a5a7cf27ed2b2c1ec86cfa20a3db9321f0d4dbd\": container with ID starting with 437a223100c587baaf37c7a11a5a7cf27ed2b2c1ec86cfa20a3db9321f0d4dbd not found: ID does not exist" containerID="437a223100c587baaf37c7a11a5a7cf27ed2b2c1ec86cfa20a3db9321f0d4dbd" Oct 03 14:14:50 crc kubenswrapper[4636]: I1003 14:14:50.261981 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"437a223100c587baaf37c7a11a5a7cf27ed2b2c1ec86cfa20a3db9321f0d4dbd"} err="failed to get container status \"437a223100c587baaf37c7a11a5a7cf27ed2b2c1ec86cfa20a3db9321f0d4dbd\": rpc error: code = NotFound desc = could not find container \"437a223100c587baaf37c7a11a5a7cf27ed2b2c1ec86cfa20a3db9321f0d4dbd\": container with ID starting with 437a223100c587baaf37c7a11a5a7cf27ed2b2c1ec86cfa20a3db9321f0d4dbd not found: ID does not exist" Oct 03 14:14:50 crc kubenswrapper[4636]: I1003 14:14:50.801522 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="993e8814-c119-4771-91ac-c0cbc352cbdc" path="/var/lib/kubelet/pods/993e8814-c119-4771-91ac-c0cbc352cbdc/volumes" Oct 03 14:14:54 crc kubenswrapper[4636]: I1003 14:14:54.646826 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-bsvgc" Oct 03 14:14:54 crc kubenswrapper[4636]: I1003 14:14:54.647193 4636 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-bsvgc" Oct 03 14:14:54 crc kubenswrapper[4636]: I1003 14:14:54.684759 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-bsvgc" Oct 03 14:14:55 crc kubenswrapper[4636]: I1003 14:14:55.237332 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-bsvgc" Oct 03 14:14:56 crc kubenswrapper[4636]: I1003 14:14:56.350833 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb"] Oct 03 14:14:56 crc kubenswrapper[4636]: E1003 14:14:56.351461 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="993e8814-c119-4771-91ac-c0cbc352cbdc" containerName="registry-server" Oct 03 14:14:56 crc kubenswrapper[4636]: I1003 14:14:56.351475 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="993e8814-c119-4771-91ac-c0cbc352cbdc" containerName="registry-server" Oct 03 14:14:56 crc kubenswrapper[4636]: E1003 14:14:56.351489 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9721832c-e1e9-49f9-96ab-253319b635bf" containerName="registry-server" Oct 03 14:14:56 crc kubenswrapper[4636]: I1003 14:14:56.351497 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="9721832c-e1e9-49f9-96ab-253319b635bf" containerName="registry-server" Oct 03 14:14:56 crc kubenswrapper[4636]: E1003 14:14:56.351507 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="993e8814-c119-4771-91ac-c0cbc352cbdc" containerName="extract-content" Oct 03 14:14:56 crc kubenswrapper[4636]: I1003 14:14:56.351534 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="993e8814-c119-4771-91ac-c0cbc352cbdc" containerName="extract-content" Oct 03 14:14:56 crc kubenswrapper[4636]: E1003 14:14:56.351546 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="993e8814-c119-4771-91ac-c0cbc352cbdc" containerName="extract-utilities" Oct 03 14:14:56 crc kubenswrapper[4636]: I1003 14:14:56.351554 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="993e8814-c119-4771-91ac-c0cbc352cbdc" containerName="extract-utilities" Oct 03 14:14:56 crc kubenswrapper[4636]: E1003 14:14:56.351572 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9721832c-e1e9-49f9-96ab-253319b635bf" containerName="extract-content" Oct 03 14:14:56 crc kubenswrapper[4636]: I1003 14:14:56.351578 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="9721832c-e1e9-49f9-96ab-253319b635bf" containerName="extract-content" Oct 03 14:14:56 crc kubenswrapper[4636]: E1003 14:14:56.351590 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9721832c-e1e9-49f9-96ab-253319b635bf" containerName="extract-utilities" Oct 03 14:14:56 crc kubenswrapper[4636]: I1003 14:14:56.351621 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="9721832c-e1e9-49f9-96ab-253319b635bf" containerName="extract-utilities" Oct 03 14:14:56 crc kubenswrapper[4636]: I1003 14:14:56.351810 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="9721832c-e1e9-49f9-96ab-253319b635bf" containerName="registry-server" Oct 03 14:14:56 crc kubenswrapper[4636]: I1003 14:14:56.351826 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="993e8814-c119-4771-91ac-c0cbc352cbdc" containerName="registry-server" Oct 03 14:14:56 crc kubenswrapper[4636]: I1003 14:14:56.353075 4636 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb" Oct 03 14:14:56 crc kubenswrapper[4636]: I1003 14:14:56.356419 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-tkwsn" Oct 03 14:14:56 crc kubenswrapper[4636]: I1003 14:14:56.365360 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb"] Oct 03 14:14:56 crc kubenswrapper[4636]: I1003 14:14:56.526594 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1a4227a-2f56-4bdd-b347-9d8df4ed42e8-util\") pod \"3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb\" (UID: \"b1a4227a-2f56-4bdd-b347-9d8df4ed42e8\") " pod="openstack-operators/3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb" Oct 03 14:14:56 crc kubenswrapper[4636]: I1003 14:14:56.526639 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrcxw\" (UniqueName: \"kubernetes.io/projected/b1a4227a-2f56-4bdd-b347-9d8df4ed42e8-kube-api-access-jrcxw\") pod \"3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb\" (UID: \"b1a4227a-2f56-4bdd-b347-9d8df4ed42e8\") " pod="openstack-operators/3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb" Oct 03 14:14:56 crc kubenswrapper[4636]: I1003 14:14:56.526913 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1a4227a-2f56-4bdd-b347-9d8df4ed42e8-bundle\") pod \"3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb\" (UID: \"b1a4227a-2f56-4bdd-b347-9d8df4ed42e8\") " pod="openstack-operators/3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb" Oct 03 14:14:56 crc kubenswrapper[4636]: I1003 14:14:56.627957 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1a4227a-2f56-4bdd-b347-9d8df4ed42e8-bundle\") pod \"3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb\" (UID: \"b1a4227a-2f56-4bdd-b347-9d8df4ed42e8\") " pod="openstack-operators/3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb" Oct 03 14:14:56 crc kubenswrapper[4636]: I1003 14:14:56.628032 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1a4227a-2f56-4bdd-b347-9d8df4ed42e8-util\") pod \"3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb\" (UID: \"b1a4227a-2f56-4bdd-b347-9d8df4ed42e8\") " pod="openstack-operators/3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb" Oct 03 14:14:56 crc kubenswrapper[4636]: I1003 14:14:56.628054 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrcxw\" (UniqueName: \"kubernetes.io/projected/b1a4227a-2f56-4bdd-b347-9d8df4ed42e8-kube-api-access-jrcxw\") pod \"3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb\" (UID: \"b1a4227a-2f56-4bdd-b347-9d8df4ed42e8\") " pod="openstack-operators/3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb" Oct 03 14:14:56 crc kubenswrapper[4636]: I1003 14:14:56.628470 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/b1a4227a-2f56-4bdd-b347-9d8df4ed42e8-bundle\") pod \"3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb\" (UID: \"b1a4227a-2f56-4bdd-b347-9d8df4ed42e8\") " pod="openstack-operators/3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb" Oct 03 14:14:56 crc kubenswrapper[4636]: I1003 14:14:56.628633 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1a4227a-2f56-4bdd-b347-9d8df4ed42e8-util\") pod \"3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb\" (UID: \"b1a4227a-2f56-4bdd-b347-9d8df4ed42e8\") " pod="openstack-operators/3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb" Oct 03 14:14:56 crc kubenswrapper[4636]: I1003 14:14:56.647328 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrcxw\" (UniqueName: \"kubernetes.io/projected/b1a4227a-2f56-4bdd-b347-9d8df4ed42e8-kube-api-access-jrcxw\") pod \"3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb\" (UID: \"b1a4227a-2f56-4bdd-b347-9d8df4ed42e8\") " pod="openstack-operators/3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb" Oct 03 14:14:56 crc kubenswrapper[4636]: I1003 14:14:56.674776 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb" Oct 03 14:14:57 crc kubenswrapper[4636]: I1003 14:14:57.067054 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb"] Oct 03 14:14:57 crc kubenswrapper[4636]: I1003 14:14:57.226408 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb" event={"ID":"b1a4227a-2f56-4bdd-b347-9d8df4ed42e8","Type":"ContainerStarted","Data":"6a16b01660f321c5afc2e9d502dca4114865c692ddb1de5f2179595866f8a99d"} Oct 03 14:14:57 crc kubenswrapper[4636]: I1003 14:14:57.226470 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb" event={"ID":"b1a4227a-2f56-4bdd-b347-9d8df4ed42e8","Type":"ContainerStarted","Data":"8c1a55ae3afc377c85545c7a4e3a7e8ad949ad2cd8c7972dcc9f8e43fd2f2624"} Oct 03 14:14:58 crc kubenswrapper[4636]: I1003 14:14:58.231699 4636 generic.go:334] "Generic (PLEG): container finished" podID="b1a4227a-2f56-4bdd-b347-9d8df4ed42e8" containerID="6a16b01660f321c5afc2e9d502dca4114865c692ddb1de5f2179595866f8a99d" exitCode=0 Oct 03 14:14:58 crc kubenswrapper[4636]: I1003 14:14:58.231745 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb" event={"ID":"b1a4227a-2f56-4bdd-b347-9d8df4ed42e8","Type":"ContainerDied","Data":"6a16b01660f321c5afc2e9d502dca4114865c692ddb1de5f2179595866f8a99d"} Oct 03 14:14:59 crc kubenswrapper[4636]: I1003 14:14:59.239019 4636 generic.go:334] "Generic (PLEG): container finished" podID="b1a4227a-2f56-4bdd-b347-9d8df4ed42e8" containerID="99c2f1f6ef2603bcb8247cfd6fe139243316d91ed4bd6b216e5695dab6f6d1d2" exitCode=0 Oct 03 14:14:59 crc kubenswrapper[4636]: I1003 14:14:59.239078 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb" 
event={"ID":"b1a4227a-2f56-4bdd-b347-9d8df4ed42e8","Type":"ContainerDied","Data":"99c2f1f6ef2603bcb8247cfd6fe139243316d91ed4bd6b216e5695dab6f6d1d2"} Oct 03 14:15:00 crc kubenswrapper[4636]: I1003 14:15:00.132747 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325015-wk268"] Oct 03 14:15:00 crc kubenswrapper[4636]: I1003 14:15:00.133791 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-wk268" Oct 03 14:15:00 crc kubenswrapper[4636]: I1003 14:15:00.137460 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 14:15:00 crc kubenswrapper[4636]: I1003 14:15:00.141498 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 14:15:00 crc kubenswrapper[4636]: I1003 14:15:00.173702 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325015-wk268"] Oct 03 14:15:00 crc kubenswrapper[4636]: I1003 14:15:00.247675 4636 generic.go:334] "Generic (PLEG): container finished" podID="b1a4227a-2f56-4bdd-b347-9d8df4ed42e8" containerID="c8102e7f794bdcd955d7bfd2e2582435f7b9930063caffa8afa5073c403abac5" exitCode=0 Oct 03 14:15:00 crc kubenswrapper[4636]: I1003 14:15:00.247723 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb" event={"ID":"b1a4227a-2f56-4bdd-b347-9d8df4ed42e8","Type":"ContainerDied","Data":"c8102e7f794bdcd955d7bfd2e2582435f7b9930063caffa8afa5073c403abac5"} Oct 03 14:15:00 crc kubenswrapper[4636]: I1003 14:15:00.272720 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwzgn\" (UniqueName: \"kubernetes.io/projected/cc638615-ad90-437e-ad21-6b25821b92f1-kube-api-access-dwzgn\") pod \"collect-profiles-29325015-wk268\" (UID: \"cc638615-ad90-437e-ad21-6b25821b92f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-wk268" Oct 03 14:15:00 crc kubenswrapper[4636]: I1003 14:15:00.272843 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc638615-ad90-437e-ad21-6b25821b92f1-secret-volume\") pod \"collect-profiles-29325015-wk268\" (UID: \"cc638615-ad90-437e-ad21-6b25821b92f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-wk268" Oct 03 14:15:00 crc kubenswrapper[4636]: I1003 14:15:00.272874 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc638615-ad90-437e-ad21-6b25821b92f1-config-volume\") pod \"collect-profiles-29325015-wk268\" (UID: \"cc638615-ad90-437e-ad21-6b25821b92f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-wk268" Oct 03 14:15:00 crc kubenswrapper[4636]: I1003 14:15:00.373883 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwzgn\" (UniqueName: \"kubernetes.io/projected/cc638615-ad90-437e-ad21-6b25821b92f1-kube-api-access-dwzgn\") pod \"collect-profiles-29325015-wk268\" (UID: \"cc638615-ad90-437e-ad21-6b25821b92f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-wk268" Oct 03 14:15:00 crc 
kubenswrapper[4636]: I1003 14:15:00.373974 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc638615-ad90-437e-ad21-6b25821b92f1-secret-volume\") pod \"collect-profiles-29325015-wk268\" (UID: \"cc638615-ad90-437e-ad21-6b25821b92f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-wk268" Oct 03 14:15:00 crc kubenswrapper[4636]: I1003 14:15:00.374005 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc638615-ad90-437e-ad21-6b25821b92f1-config-volume\") pod \"collect-profiles-29325015-wk268\" (UID: \"cc638615-ad90-437e-ad21-6b25821b92f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-wk268" Oct 03 14:15:00 crc kubenswrapper[4636]: I1003 14:15:00.374859 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc638615-ad90-437e-ad21-6b25821b92f1-config-volume\") pod \"collect-profiles-29325015-wk268\" (UID: \"cc638615-ad90-437e-ad21-6b25821b92f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-wk268" Oct 03 14:15:00 crc kubenswrapper[4636]: I1003 14:15:00.384899 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc638615-ad90-437e-ad21-6b25821b92f1-secret-volume\") pod \"collect-profiles-29325015-wk268\" (UID: \"cc638615-ad90-437e-ad21-6b25821b92f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-wk268" Oct 03 14:15:00 crc kubenswrapper[4636]: I1003 14:15:00.393416 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwzgn\" (UniqueName: \"kubernetes.io/projected/cc638615-ad90-437e-ad21-6b25821b92f1-kube-api-access-dwzgn\") pod \"collect-profiles-29325015-wk268\" (UID: \"cc638615-ad90-437e-ad21-6b25821b92f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-wk268" Oct 03 14:15:00 crc kubenswrapper[4636]: I1003 14:15:00.452074 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-wk268" Oct 03 14:15:00 crc kubenswrapper[4636]: I1003 14:15:00.831950 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325015-wk268"] Oct 03 14:15:01 crc kubenswrapper[4636]: I1003 14:15:01.255641 4636 generic.go:334] "Generic (PLEG): container finished" podID="cc638615-ad90-437e-ad21-6b25821b92f1" containerID="01ad34bfabeabf5463339da228b48b91fe77ca5f1d4dcaa087aa15baf05d59b5" exitCode=0 Oct 03 14:15:01 crc kubenswrapper[4636]: I1003 14:15:01.256233 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-wk268" event={"ID":"cc638615-ad90-437e-ad21-6b25821b92f1","Type":"ContainerDied","Data":"01ad34bfabeabf5463339da228b48b91fe77ca5f1d4dcaa087aa15baf05d59b5"} Oct 03 14:15:01 crc kubenswrapper[4636]: I1003 14:15:01.256292 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-wk268" event={"ID":"cc638615-ad90-437e-ad21-6b25821b92f1","Type":"ContainerStarted","Data":"91a3d6a0c39f78e12e026c36f582bcab70d2bcf9581332d3eea8058a6a2c84f5"} Oct 03 14:15:01 crc kubenswrapper[4636]: I1003 14:15:01.513348 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb" Oct 03 14:15:01 crc kubenswrapper[4636]: I1003 14:15:01.691480 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrcxw\" (UniqueName: \"kubernetes.io/projected/b1a4227a-2f56-4bdd-b347-9d8df4ed42e8-kube-api-access-jrcxw\") pod \"b1a4227a-2f56-4bdd-b347-9d8df4ed42e8\" (UID: \"b1a4227a-2f56-4bdd-b347-9d8df4ed42e8\") " Oct 03 14:15:01 crc kubenswrapper[4636]: I1003 14:15:01.691778 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1a4227a-2f56-4bdd-b347-9d8df4ed42e8-util\") pod \"b1a4227a-2f56-4bdd-b347-9d8df4ed42e8\" (UID: \"b1a4227a-2f56-4bdd-b347-9d8df4ed42e8\") " Oct 03 14:15:01 crc kubenswrapper[4636]: I1003 14:15:01.691948 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1a4227a-2f56-4bdd-b347-9d8df4ed42e8-bundle\") pod \"b1a4227a-2f56-4bdd-b347-9d8df4ed42e8\" (UID: \"b1a4227a-2f56-4bdd-b347-9d8df4ed42e8\") " Oct 03 14:15:01 crc kubenswrapper[4636]: I1003 14:15:01.692738 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1a4227a-2f56-4bdd-b347-9d8df4ed42e8-bundle" (OuterVolumeSpecName: "bundle") pod "b1a4227a-2f56-4bdd-b347-9d8df4ed42e8" (UID: "b1a4227a-2f56-4bdd-b347-9d8df4ed42e8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:15:01 crc kubenswrapper[4636]: I1003 14:15:01.701657 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1a4227a-2f56-4bdd-b347-9d8df4ed42e8-kube-api-access-jrcxw" (OuterVolumeSpecName: "kube-api-access-jrcxw") pod "b1a4227a-2f56-4bdd-b347-9d8df4ed42e8" (UID: "b1a4227a-2f56-4bdd-b347-9d8df4ed42e8"). InnerVolumeSpecName "kube-api-access-jrcxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:15:01 crc kubenswrapper[4636]: I1003 14:15:01.705294 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1a4227a-2f56-4bdd-b347-9d8df4ed42e8-util" (OuterVolumeSpecName: "util") pod "b1a4227a-2f56-4bdd-b347-9d8df4ed42e8" (UID: "b1a4227a-2f56-4bdd-b347-9d8df4ed42e8"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:15:01 crc kubenswrapper[4636]: I1003 14:15:01.795546 4636 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1a4227a-2f56-4bdd-b347-9d8df4ed42e8-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:15:01 crc kubenswrapper[4636]: I1003 14:15:01.795573 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrcxw\" (UniqueName: \"kubernetes.io/projected/b1a4227a-2f56-4bdd-b347-9d8df4ed42e8-kube-api-access-jrcxw\") on node \"crc\" DevicePath \"\"" Oct 03 14:15:01 crc kubenswrapper[4636]: I1003 14:15:01.795582 4636 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1a4227a-2f56-4bdd-b347-9d8df4ed42e8-util\") on node \"crc\" DevicePath \"\"" Oct 03 14:15:02 crc kubenswrapper[4636]: I1003 14:15:02.263546 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb" event={"ID":"b1a4227a-2f56-4bdd-b347-9d8df4ed42e8","Type":"ContainerDied","Data":"8c1a55ae3afc377c85545c7a4e3a7e8ad949ad2cd8c7972dcc9f8e43fd2f2624"} Oct 03 14:15:02 crc kubenswrapper[4636]: I1003 14:15:02.263581 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb" Oct 03 14:15:02 crc kubenswrapper[4636]: I1003 14:15:02.263593 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c1a55ae3afc377c85545c7a4e3a7e8ad949ad2cd8c7972dcc9f8e43fd2f2624" Oct 03 14:15:02 crc kubenswrapper[4636]: I1003 14:15:02.530495 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-wk268" Oct 03 14:15:02 crc kubenswrapper[4636]: I1003 14:15:02.709598 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwzgn\" (UniqueName: \"kubernetes.io/projected/cc638615-ad90-437e-ad21-6b25821b92f1-kube-api-access-dwzgn\") pod \"cc638615-ad90-437e-ad21-6b25821b92f1\" (UID: \"cc638615-ad90-437e-ad21-6b25821b92f1\") " Oct 03 14:15:02 crc kubenswrapper[4636]: I1003 14:15:02.709664 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc638615-ad90-437e-ad21-6b25821b92f1-secret-volume\") pod \"cc638615-ad90-437e-ad21-6b25821b92f1\" (UID: \"cc638615-ad90-437e-ad21-6b25821b92f1\") " Oct 03 14:15:02 crc kubenswrapper[4636]: I1003 14:15:02.709744 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc638615-ad90-437e-ad21-6b25821b92f1-config-volume\") pod \"cc638615-ad90-437e-ad21-6b25821b92f1\" (UID: \"cc638615-ad90-437e-ad21-6b25821b92f1\") " Oct 03 14:15:02 crc kubenswrapper[4636]: I1003 14:15:02.710955 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc638615-ad90-437e-ad21-6b25821b92f1-config-volume" (OuterVolumeSpecName: "config-volume") pod "cc638615-ad90-437e-ad21-6b25821b92f1" (UID: "cc638615-ad90-437e-ad21-6b25821b92f1"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:15:02 crc kubenswrapper[4636]: I1003 14:15:02.712643 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc638615-ad90-437e-ad21-6b25821b92f1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cc638615-ad90-437e-ad21-6b25821b92f1" (UID: "cc638615-ad90-437e-ad21-6b25821b92f1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:15:02 crc kubenswrapper[4636]: I1003 14:15:02.712686 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc638615-ad90-437e-ad21-6b25821b92f1-kube-api-access-dwzgn" (OuterVolumeSpecName: "kube-api-access-dwzgn") pod "cc638615-ad90-437e-ad21-6b25821b92f1" (UID: "cc638615-ad90-437e-ad21-6b25821b92f1"). InnerVolumeSpecName "kube-api-access-dwzgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:15:02 crc kubenswrapper[4636]: I1003 14:15:02.810999 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwzgn\" (UniqueName: \"kubernetes.io/projected/cc638615-ad90-437e-ad21-6b25821b92f1-kube-api-access-dwzgn\") on node \"crc\" DevicePath \"\"" Oct 03 14:15:02 crc kubenswrapper[4636]: I1003 14:15:02.811275 4636 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc638615-ad90-437e-ad21-6b25821b92f1-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 14:15:02 crc kubenswrapper[4636]: I1003 14:15:02.811333 4636 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc638615-ad90-437e-ad21-6b25821b92f1-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 14:15:03 crc kubenswrapper[4636]: I1003 14:15:03.270866 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-wk268" event={"ID":"cc638615-ad90-437e-ad21-6b25821b92f1","Type":"ContainerDied","Data":"91a3d6a0c39f78e12e026c36f582bcab70d2bcf9581332d3eea8058a6a2c84f5"} Oct 03 14:15:03 crc kubenswrapper[4636]: I1003 14:15:03.270945 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325015-wk268" Oct 03 14:15:03 crc kubenswrapper[4636]: I1003 14:15:03.270917 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91a3d6a0c39f78e12e026c36f582bcab70d2bcf9581332d3eea8058a6a2c84f5" Oct 03 14:15:05 crc kubenswrapper[4636]: I1003 14:15:05.509237 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-66d65dc5dc-ljjsx"] Oct 03 14:15:05 crc kubenswrapper[4636]: E1003 14:15:05.509821 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a4227a-2f56-4bdd-b347-9d8df4ed42e8" containerName="util" Oct 03 14:15:05 crc kubenswrapper[4636]: I1003 14:15:05.509835 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a4227a-2f56-4bdd-b347-9d8df4ed42e8" containerName="util" Oct 03 14:15:05 crc kubenswrapper[4636]: E1003 14:15:05.509850 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a4227a-2f56-4bdd-b347-9d8df4ed42e8" containerName="extract" Oct 03 14:15:05 crc kubenswrapper[4636]: I1003 14:15:05.509857 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a4227a-2f56-4bdd-b347-9d8df4ed42e8" containerName="extract" Oct 03 14:15:05 crc kubenswrapper[4636]: E1003 14:15:05.509874 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a4227a-2f56-4bdd-b347-9d8df4ed42e8" containerName="pull" Oct 03 14:15:05 crc kubenswrapper[4636]: I1003 14:15:05.509881 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a4227a-2f56-4bdd-b347-9d8df4ed42e8" containerName="pull" Oct 03 14:15:05 crc kubenswrapper[4636]: E1003 14:15:05.509894 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc638615-ad90-437e-ad21-6b25821b92f1" containerName="collect-profiles" Oct 03 14:15:05 crc kubenswrapper[4636]: I1003 14:15:05.509901 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc638615-ad90-437e-ad21-6b25821b92f1" containerName="collect-profiles" Oct 03 14:15:05 crc kubenswrapper[4636]: I1003 14:15:05.510042 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc638615-ad90-437e-ad21-6b25821b92f1" containerName="collect-profiles" Oct 03 14:15:05 crc kubenswrapper[4636]: I1003 14:15:05.510054 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1a4227a-2f56-4bdd-b347-9d8df4ed42e8" containerName="extract" Oct 03 14:15:05 crc kubenswrapper[4636]: I1003 14:15:05.510784 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-66d65dc5dc-ljjsx" Oct 03 14:15:05 crc kubenswrapper[4636]: I1003 14:15:05.513647 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-snwcp" Oct 03 14:15:05 crc kubenswrapper[4636]: I1003 14:15:05.539948 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-66d65dc5dc-ljjsx"] Oct 03 14:15:05 crc kubenswrapper[4636]: I1003 14:15:05.646403 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55l7f\" (UniqueName: \"kubernetes.io/projected/c266feaf-9983-414a-b65e-5a13fc55c419-kube-api-access-55l7f\") pod \"openstack-operator-controller-operator-66d65dc5dc-ljjsx\" (UID: \"c266feaf-9983-414a-b65e-5a13fc55c419\") " pod="openstack-operators/openstack-operator-controller-operator-66d65dc5dc-ljjsx" Oct 03 14:15:05 crc kubenswrapper[4636]: I1003 14:15:05.747572 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55l7f\" (UniqueName: \"kubernetes.io/projected/c266feaf-9983-414a-b65e-5a13fc55c419-kube-api-access-55l7f\") pod \"openstack-operator-controller-operator-66d65dc5dc-ljjsx\" (UID: \"c266feaf-9983-414a-b65e-5a13fc55c419\") " pod="openstack-operators/openstack-operator-controller-operator-66d65dc5dc-ljjsx" Oct 03 14:15:05 crc kubenswrapper[4636]: I1003 14:15:05.767248 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55l7f\" (UniqueName: \"kubernetes.io/projected/c266feaf-9983-414a-b65e-5a13fc55c419-kube-api-access-55l7f\") pod \"openstack-operator-controller-operator-66d65dc5dc-ljjsx\" (UID: \"c266feaf-9983-414a-b65e-5a13fc55c419\") " pod="openstack-operators/openstack-operator-controller-operator-66d65dc5dc-ljjsx" Oct 03 14:15:05 crc kubenswrapper[4636]: I1003 14:15:05.826815 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-66d65dc5dc-ljjsx" Oct 03 14:15:06 crc kubenswrapper[4636]: I1003 14:15:06.088156 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-66d65dc5dc-ljjsx"] Oct 03 14:15:06 crc kubenswrapper[4636]: I1003 14:15:06.292461 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-66d65dc5dc-ljjsx" event={"ID":"c266feaf-9983-414a-b65e-5a13fc55c419","Type":"ContainerStarted","Data":"a8c5ad5fc50896b3b645d1001e3a71fd6ec56c255c13772c6c6b71b2d164217a"} Oct 03 14:15:10 crc kubenswrapper[4636]: I1003 14:15:10.323811 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-66d65dc5dc-ljjsx" event={"ID":"c266feaf-9983-414a-b65e-5a13fc55c419","Type":"ContainerStarted","Data":"4f6857ca91a8904059e8966b579f955311cdf6e25a8606f6ea912b78215259f6"} Oct 03 14:15:13 crc kubenswrapper[4636]: I1003 14:15:13.342876 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-66d65dc5dc-ljjsx" event={"ID":"c266feaf-9983-414a-b65e-5a13fc55c419","Type":"ContainerStarted","Data":"b69f0aa1281d651f37ab89e076283b89470384d35ad7bc95bed51a88b7d71592"} Oct 03 14:15:13 crc kubenswrapper[4636]: I1003 14:15:13.343403 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-66d65dc5dc-ljjsx" Oct 03 14:15:15 crc kubenswrapper[4636]: I1003 14:15:15.828807 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-66d65dc5dc-ljjsx" Oct 03 14:15:15 crc kubenswrapper[4636]: I1003 14:15:15.861494 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-66d65dc5dc-ljjsx" podStartSLOduration=4.470251006 podStartE2EDuration="10.861472635s" podCreationTimestamp="2025-10-03 14:15:05 +0000 UTC" firstStartedPulling="2025-10-03 14:15:06.113458897 +0000 UTC m=+855.972185144" lastFinishedPulling="2025-10-03 14:15:12.504680516 +0000 UTC m=+862.363406773" observedRunningTime="2025-10-03 14:15:13.371407679 +0000 UTC m=+863.230133926" watchObservedRunningTime="2025-10-03 14:15:15.861472635 +0000 UTC m=+865.720198882" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.636627 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5f7c849b98-fh8lr"] Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.638002 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-fh8lr" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.655010 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-2npl2" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.659067 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5f7c849b98-fh8lr"] Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.665874 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-z87w6"] Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.666926 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-z87w6" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.672343 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-2nrrj" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.688167 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-z87w6"] Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.691153 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-x9xms"] Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.692075 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-x9xms" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.696704 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-x9xms"] Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.699553 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-dxp49" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.717802 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4wdj\" (UniqueName: \"kubernetes.io/projected/12b01d5f-b89d-4bf4-bd46-387f2a7ab48f-kube-api-access-k4wdj\") pod \"barbican-operator-controller-manager-5f7c849b98-fh8lr\" (UID: \"12b01d5f-b89d-4bf4-bd46-387f2a7ab48f\") " pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-fh8lr" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.717838 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv6js\" (UniqueName: \"kubernetes.io/projected/c8d803e5-9eca-49bf-976a-2acdfc25a727-kube-api-access-dv6js\") pod \"cinder-operator-controller-manager-7d4d4f8d-z87w6\" (UID: \"c8d803e5-9eca-49bf-976a-2acdfc25a727\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-z87w6" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.717917 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcpk4\" (UniqueName: \"kubernetes.io/projected/8002528c-8119-4119-923c-1e15162e63f3-kube-api-access-xcpk4\") pod \"designate-operator-controller-manager-75dfd9b554-x9xms\" (UID: \"8002528c-8119-4119-923c-1e15162e63f3\") " 
pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-x9xms" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.722236 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5568b5d68-g8m75"] Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.723132 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-g8m75" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.729834 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-6shxn" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.734001 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5568b5d68-g8m75"] Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.767210 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-8f58bc9db-nshvn"] Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.768202 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-nshvn" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.770591 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54876c876f-dwvpt"] Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.771566 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dwvpt" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.775652 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-nh55c" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.776795 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-924t7" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.816360 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54876c876f-dwvpt"] Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.816405 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-6h5gc"] Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.817564 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-699b87f775-p7d6m"] Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.818320 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-p7d6m" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.818517 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6h5gc" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.823491 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.823964 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-cn7hn" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.824119 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vcp2k" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.828766 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-8f58bc9db-nshvn"] Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.830811 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-6h5gc"] Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.836039 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtzdn\" (UniqueName: \"kubernetes.io/projected/3c207da6-bfc7-4287-aa67-56c0097f48f3-kube-api-access-dtzdn\") pod \"glance-operator-controller-manager-5568b5d68-g8m75\" (UID: \"3c207da6-bfc7-4287-aa67-56c0097f48f3\") " pod="openstack-operators/glance-operator-controller-manager-5568b5d68-g8m75" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.836118 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e000db3-2d29-4608-9a70-cfe88094a950-cert\") pod \"infra-operator-controller-manager-658588b8c9-6h5gc\" (UID: \"6e000db3-2d29-4608-9a70-cfe88094a950\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6h5gc" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.836147 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhz6l\" (UniqueName: \"kubernetes.io/projected/6e000db3-2d29-4608-9a70-cfe88094a950-kube-api-access-nhz6l\") pod \"infra-operator-controller-manager-658588b8c9-6h5gc\" (UID: \"6e000db3-2d29-4608-9a70-cfe88094a950\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6h5gc" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.836182 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcpk4\" (UniqueName: \"kubernetes.io/projected/8002528c-8119-4119-923c-1e15162e63f3-kube-api-access-xcpk4\") pod \"designate-operator-controller-manager-75dfd9b554-x9xms\" (UID: \"8002528c-8119-4119-923c-1e15162e63f3\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-x9xms" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.836215 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbrgz\" (UniqueName: \"kubernetes.io/projected/24c6a469-5b37-4dc9-baed-6a3c54b11861-kube-api-access-wbrgz\") pod \"horizon-operator-controller-manager-54876c876f-dwvpt\" (UID: \"24c6a469-5b37-4dc9-baed-6a3c54b11861\") " pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dwvpt" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.837142 4636 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-k4wdj\" (UniqueName: \"kubernetes.io/projected/12b01d5f-b89d-4bf4-bd46-387f2a7ab48f-kube-api-access-k4wdj\") pod \"barbican-operator-controller-manager-5f7c849b98-fh8lr\" (UID: \"12b01d5f-b89d-4bf4-bd46-387f2a7ab48f\") " pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-fh8lr" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.837372 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv6js\" (UniqueName: \"kubernetes.io/projected/c8d803e5-9eca-49bf-976a-2acdfc25a727-kube-api-access-dv6js\") pod \"cinder-operator-controller-manager-7d4d4f8d-z87w6\" (UID: \"c8d803e5-9eca-49bf-976a-2acdfc25a727\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-z87w6" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.837446 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzq4d\" (UniqueName: \"kubernetes.io/projected/eaba1b01-dfa6-48e4-b4f3-70a67fbfa8b5-kube-api-access-tzq4d\") pod \"heat-operator-controller-manager-8f58bc9db-nshvn\" (UID: \"eaba1b01-dfa6-48e4-b4f3-70a67fbfa8b5\") " pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-nshvn" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.837472 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzcqm\" (UniqueName: \"kubernetes.io/projected/d9a0c033-eaea-4336-96e6-9664f726e50e-kube-api-access-mzcqm\") pod \"ironic-operator-controller-manager-699b87f775-p7d6m\" (UID: \"d9a0c033-eaea-4336-96e6-9664f726e50e\") " pod="openstack-operators/ironic-operator-controller-manager-699b87f775-p7d6m" Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.842178 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-699b87f775-p7d6m"] Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.864226 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-655d88ccb9-jcqmk"] Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.865382 4636 util.go:30] "No sandbox for pod can be found. 
Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.871767 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcpk4\" (UniqueName: \"kubernetes.io/projected/8002528c-8119-4119-923c-1e15162e63f3-kube-api-access-xcpk4\") pod \"designate-operator-controller-manager-75dfd9b554-x9xms\" (UID: \"8002528c-8119-4119-923c-1e15162e63f3\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-x9xms"
Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.871791 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv6js\" (UniqueName: \"kubernetes.io/projected/c8d803e5-9eca-49bf-976a-2acdfc25a727-kube-api-access-dv6js\") pod \"cinder-operator-controller-manager-7d4d4f8d-z87w6\" (UID: \"c8d803e5-9eca-49bf-976a-2acdfc25a727\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-z87w6"
Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.881412 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-84hxc"
Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.885020 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4wdj\" (UniqueName: \"kubernetes.io/projected/12b01d5f-b89d-4bf4-bd46-387f2a7ab48f-kube-api-access-k4wdj\") pod \"barbican-operator-controller-manager-5f7c849b98-fh8lr\" (UID: \"12b01d5f-b89d-4bf4-bd46-387f2a7ab48f\") " pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-fh8lr"
Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.892170 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-8knqr"]
Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.919550 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-8knqr"
Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.928387 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-f6mll"
Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.938530 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzq4d\" (UniqueName: \"kubernetes.io/projected/eaba1b01-dfa6-48e4-b4f3-70a67fbfa8b5-kube-api-access-tzq4d\") pod \"heat-operator-controller-manager-8f58bc9db-nshvn\" (UID: \"eaba1b01-dfa6-48e4-b4f3-70a67fbfa8b5\") " pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-nshvn"
Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.938702 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzcqm\" (UniqueName: \"kubernetes.io/projected/d9a0c033-eaea-4336-96e6-9664f726e50e-kube-api-access-mzcqm\") pod \"ironic-operator-controller-manager-699b87f775-p7d6m\" (UID: \"d9a0c033-eaea-4336-96e6-9664f726e50e\") " pod="openstack-operators/ironic-operator-controller-manager-699b87f775-p7d6m"
Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.938836 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtzdn\" (UniqueName: \"kubernetes.io/projected/3c207da6-bfc7-4287-aa67-56c0097f48f3-kube-api-access-dtzdn\") pod \"glance-operator-controller-manager-5568b5d68-g8m75\" (UID: \"3c207da6-bfc7-4287-aa67-56c0097f48f3\") " pod="openstack-operators/glance-operator-controller-manager-5568b5d68-g8m75"
Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.938972 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e000db3-2d29-4608-9a70-cfe88094a950-cert\") pod \"infra-operator-controller-manager-658588b8c9-6h5gc\" (UID: \"6e000db3-2d29-4608-9a70-cfe88094a950\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6h5gc"
Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.939071 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhz6l\" (UniqueName: \"kubernetes.io/projected/6e000db3-2d29-4608-9a70-cfe88094a950-kube-api-access-nhz6l\") pod \"infra-operator-controller-manager-658588b8c9-6h5gc\" (UID: \"6e000db3-2d29-4608-9a70-cfe88094a950\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6h5gc"
Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.939385 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbrgz\" (UniqueName: \"kubernetes.io/projected/24c6a469-5b37-4dc9-baed-6a3c54b11861-kube-api-access-wbrgz\") pod \"horizon-operator-controller-manager-54876c876f-dwvpt\" (UID: \"24c6a469-5b37-4dc9-baed-6a3c54b11861\") " pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dwvpt"
Oct 03 14:15:30 crc kubenswrapper[4636]: E1003 14:15:30.940355 4636 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Oct 03 14:15:30 crc kubenswrapper[4636]: E1003 14:15:30.940424 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e000db3-2d29-4608-9a70-cfe88094a950-cert podName:6e000db3-2d29-4608-9a70-cfe88094a950 nodeName:}" failed. No retries permitted until 2025-10-03 14:15:31.440404721 +0000 UTC m=+881.299131058 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e000db3-2d29-4608-9a70-cfe88094a950-cert") pod "infra-operator-controller-manager-658588b8c9-6h5gc" (UID: "6e000db3-2d29-4608-9a70-cfe88094a950") : secret "infra-operator-webhook-server-cert" not found
Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.956169 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-fh8lr"
Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.977707 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtzdn\" (UniqueName: \"kubernetes.io/projected/3c207da6-bfc7-4287-aa67-56c0097f48f3-kube-api-access-dtzdn\") pod \"glance-operator-controller-manager-5568b5d68-g8m75\" (UID: \"3c207da6-bfc7-4287-aa67-56c0097f48f3\") " pod="openstack-operators/glance-operator-controller-manager-5568b5d68-g8m75"
Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.981702 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbrgz\" (UniqueName: \"kubernetes.io/projected/24c6a469-5b37-4dc9-baed-6a3c54b11861-kube-api-access-wbrgz\") pod \"horizon-operator-controller-manager-54876c876f-dwvpt\" (UID: \"24c6a469-5b37-4dc9-baed-6a3c54b11861\") " pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dwvpt"
Oct 03 14:15:30 crc kubenswrapper[4636]: I1003 14:15:30.981963 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-z87w6"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:30.996771 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzcqm\" (UniqueName: \"kubernetes.io/projected/d9a0c033-eaea-4336-96e6-9664f726e50e-kube-api-access-mzcqm\") pod \"ironic-operator-controller-manager-699b87f775-p7d6m\" (UID: \"d9a0c033-eaea-4336-96e6-9664f726e50e\") " pod="openstack-operators/ironic-operator-controller-manager-699b87f775-p7d6m"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.013831 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhz6l\" (UniqueName: \"kubernetes.io/projected/6e000db3-2d29-4608-9a70-cfe88094a950-kube-api-access-nhz6l\") pod \"infra-operator-controller-manager-658588b8c9-6h5gc\" (UID: \"6e000db3-2d29-4608-9a70-cfe88094a950\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6h5gc"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.023275 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-x9xms"
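
Annotation: the two E-level entries just above are the first real failure in this window. The infra-operator pod declares a "cert" volume backed by the secret infra-operator-webhook-server-cert, which does not exist yet (such webhook cert secrets are typically created asynchronously by other machinery), so MountVolume.SetUp fails and a retry is scheduled 500ms out. The kubelet is effectively polling until the secret appears; a minimal client-go sketch of that wait looks like the following (kubeconfig location, interval, and timeout are illustrative assumptions, not values taken from this log):

    package main

    import (
    	"context"
    	"fmt"
    	"time"

    	apierrors "k8s.io/apimachinery/pkg/api/errors"
    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    	"k8s.io/apimachinery/pkg/util/wait"
    	"k8s.io/client-go/kubernetes"
    	"k8s.io/client-go/tools/clientcmd"
    )

    func main() {
    	// Build a client from the default kubeconfig (assumed location).
    	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
    	if err != nil {
    		panic(err)
    	}
    	cs, err := kubernetes.NewForConfig(cfg)
    	if err != nil {
    		panic(err)
    	}
    	// Poll until the webhook cert secret exists, which is the condition
    	// the kubelet's mount retry loop is waiting on in the log above.
    	err = wait.PollUntilContextTimeout(context.Background(), 2*time.Second, 2*time.Minute, true,
    		func(ctx context.Context) (bool, error) {
    			_, err := cs.CoreV1().Secrets("openstack-operators").
    				Get(ctx, "infra-operator-webhook-server-cert", metav1.GetOptions{})
    			if apierrors.IsNotFound(err) {
    				return false, nil // not there yet; keep polling
    			}
    			return err == nil, err
    		})
    	if err != nil {
    		panic(err)
    	}
    	fmt.Println("secret present; the cert volume can now mount")
    }
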
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.028173 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-655d88ccb9-jcqmk"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.031682 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzq4d\" (UniqueName: \"kubernetes.io/projected/eaba1b01-dfa6-48e4-b4f3-70a67fbfa8b5-kube-api-access-tzq4d\") pod \"heat-operator-controller-manager-8f58bc9db-nshvn\" (UID: \"eaba1b01-dfa6-48e4-b4f3-70a67fbfa8b5\") " pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-nshvn"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.040810 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct2xg\" (UniqueName: \"kubernetes.io/projected/dff12a21-eff6-45da-bf37-d3f0620f9c05-kube-api-access-ct2xg\") pod \"manila-operator-controller-manager-65d89cfd9f-8knqr\" (UID: \"dff12a21-eff6-45da-bf37-d3f0620f9c05\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-8knqr"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.040894 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnpzf\" (UniqueName: \"kubernetes.io/projected/62436a9b-229c-486b-a715-6787e100d19b-kube-api-access-fnpzf\") pod \"keystone-operator-controller-manager-655d88ccb9-jcqmk\" (UID: \"62436a9b-229c-486b-a715-6787e100d19b\") " pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-jcqmk"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.054398 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-g8m75"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.079690 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-2tjsj"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.080636 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-2tjsj"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.087779 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-jgnsl"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.095430 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-9dc5p"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.096547 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-9dc5p"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.100270 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-8knqr"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.101339 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-nshvn"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.105418 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-cbxrx"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.111667 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dwvpt"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.113695 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-gfbtq"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.148163 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-2tjsj"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.148788 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-cbxrx"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.152475 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-mxmdr"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.188822 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-p7d6m"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.190725 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct2xg\" (UniqueName: \"kubernetes.io/projected/dff12a21-eff6-45da-bf37-d3f0620f9c05-kube-api-access-ct2xg\") pod \"manila-operator-controller-manager-65d89cfd9f-8knqr\" (UID: \"dff12a21-eff6-45da-bf37-d3f0620f9c05\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-8knqr"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.190800 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnpzf\" (UniqueName: \"kubernetes.io/projected/62436a9b-229c-486b-a715-6787e100d19b-kube-api-access-fnpzf\") pod \"keystone-operator-controller-manager-655d88ccb9-jcqmk\" (UID: \"62436a9b-229c-486b-a715-6787e100d19b\") " pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-jcqmk"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.250609 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnpzf\" (UniqueName: \"kubernetes.io/projected/62436a9b-229c-486b-a715-6787e100d19b-kube-api-access-fnpzf\") pod \"keystone-operator-controller-manager-655d88ccb9-jcqmk\" (UID: \"62436a9b-229c-486b-a715-6787e100d19b\") " pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-jcqmk"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.257442 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct2xg\" (UniqueName: \"kubernetes.io/projected/dff12a21-eff6-45da-bf37-d3f0620f9c05-kube-api-access-ct2xg\") pod \"manila-operator-controller-manager-65d89cfd9f-8knqr\" (UID: \"dff12a21-eff6-45da-bf37-d3f0620f9c05\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-8knqr"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.264430 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-9dc5p"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.278257 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-8knqr"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.292218 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-48wqb"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.294189 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfjmh\" (UniqueName: \"kubernetes.io/projected/8563d341-44cb-43b4-b7a8-ba3beeac60ea-kube-api-access-wfjmh\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-2tjsj\" (UID: \"8563d341-44cb-43b4-b7a8-ba3beeac60ea\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-2tjsj"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.294273 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7cq2\" (UniqueName: \"kubernetes.io/projected/89e06d08-9381-4aff-ba52-682080bd03bb-kube-api-access-h7cq2\") pod \"neutron-operator-controller-manager-8d984cc4d-9dc5p\" (UID: \"89e06d08-9381-4aff-ba52-682080bd03bb\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-9dc5p"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.294306 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xnjn\" (UniqueName: \"kubernetes.io/projected/60ec0b38-a07e-46e2-bc94-1af33d301eb6-kube-api-access-7xnjn\") pod \"nova-operator-controller-manager-7c7fc454ff-cbxrx\" (UID: \"60ec0b38-a07e-46e2-bc94-1af33d301eb6\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-cbxrx"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.295994 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-48wqb"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.298925 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-mlqf2"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.308579 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-cbxrx"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.355339 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-48wqb"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.395973 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vblzp\" (UniqueName: \"kubernetes.io/projected/2f612d08-a478-46d7-aefd-f31051af25d9-kube-api-access-vblzp\") pod \"octavia-operator-controller-manager-7468f855d8-48wqb\" (UID: \"2f612d08-a478-46d7-aefd-f31051af25d9\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-48wqb"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.396050 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfjmh\" (UniqueName: \"kubernetes.io/projected/8563d341-44cb-43b4-b7a8-ba3beeac60ea-kube-api-access-wfjmh\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-2tjsj\" (UID: \"8563d341-44cb-43b4-b7a8-ba3beeac60ea\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-2tjsj"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.396179 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7cq2\" (UniqueName: \"kubernetes.io/projected/89e06d08-9381-4aff-ba52-682080bd03bb-kube-api-access-h7cq2\") pod \"neutron-operator-controller-manager-8d984cc4d-9dc5p\" (UID: \"89e06d08-9381-4aff-ba52-682080bd03bb\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-9dc5p"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.396455 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xnjn\" (UniqueName: \"kubernetes.io/projected/60ec0b38-a07e-46e2-bc94-1af33d301eb6-kube-api-access-7xnjn\") pod \"nova-operator-controller-manager-7c7fc454ff-cbxrx\" (UID: \"60ec0b38-a07e-46e2-bc94-1af33d301eb6\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-cbxrx"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.418261 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xnjn\" (UniqueName: \"kubernetes.io/projected/60ec0b38-a07e-46e2-bc94-1af33d301eb6-kube-api-access-7xnjn\") pod \"nova-operator-controller-manager-7c7fc454ff-cbxrx\" (UID: \"60ec0b38-a07e-46e2-bc94-1af33d301eb6\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-cbxrx"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.433811 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-579449c7d5-8nj2c"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.435002 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-8nj2c"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.440388 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-c44t4"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.462648 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7cq2\" (UniqueName: \"kubernetes.io/projected/89e06d08-9381-4aff-ba52-682080bd03bb-kube-api-access-h7cq2\") pod \"neutron-operator-controller-manager-8d984cc4d-9dc5p\" (UID: \"89e06d08-9381-4aff-ba52-682080bd03bb\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-9dc5p"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.462728 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfjmh\" (UniqueName: \"kubernetes.io/projected/8563d341-44cb-43b4-b7a8-ba3beeac60ea-kube-api-access-wfjmh\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-2tjsj\" (UID: \"8563d341-44cb-43b4-b7a8-ba3beeac60ea\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-2tjsj"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.465963 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.468150 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.472995 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-9dc5p"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.473637 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.479388 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-lkd7z"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.480425 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lkd7z"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.485823 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-z2w52"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.486075 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-5vcl6"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.493150 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-7qrjk"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.494593 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-7qrjk"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.497750 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-ltddt"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.498448 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vblzp\" (UniqueName: \"kubernetes.io/projected/2f612d08-a478-46d7-aefd-f31051af25d9-kube-api-access-vblzp\") pod \"octavia-operator-controller-manager-7468f855d8-48wqb\" (UID: \"2f612d08-a478-46d7-aefd-f31051af25d9\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-48wqb"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.498486 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnz5l\" (UniqueName: \"kubernetes.io/projected/f8f9f506-672a-4f93-8645-f0cd608feed0-kube-api-access-vnz5l\") pod \"ovn-operator-controller-manager-579449c7d5-8nj2c\" (UID: \"f8f9f506-672a-4f93-8645-f0cd608feed0\") " pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-8nj2c"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.498534 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e000db3-2d29-4608-9a70-cfe88094a950-cert\") pod \"infra-operator-controller-manager-658588b8c9-6h5gc\" (UID: \"6e000db3-2d29-4608-9a70-cfe88094a950\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6h5gc"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.498563 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pxzz\" (UniqueName: \"kubernetes.io/projected/b3cb07c2-c2b9-4421-baba-ede1bed11656-kube-api-access-4pxzz\") pod \"swift-operator-controller-manager-6859f9b676-7qrjk\" (UID: \"b3cb07c2-c2b9-4421-baba-ede1bed11656\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-7qrjk"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.498586 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bc24\" (UniqueName: \"kubernetes.io/projected/314cbc97-254d-4e64-a06f-68c7b0488c46-kube-api-access-9bc24\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj\" (UID: \"314cbc97-254d-4e64-a06f-68c7b0488c46\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.498615 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/314cbc97-254d-4e64-a06f-68c7b0488c46-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj\" (UID: \"314cbc97-254d-4e64-a06f-68c7b0488c46\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.498648 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmj5l\" (UniqueName: \"kubernetes.io/projected/ad1290bf-25e9-4766-8398-ff4811e65cad-kube-api-access-zmj5l\") pod \"placement-operator-controller-manager-54689d9f88-lkd7z\" (UID: \"ad1290bf-25e9-4766-8398-ff4811e65cad\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lkd7z"
Oct 03 14:15:31 crc kubenswrapper[4636]: E1003 14:15:31.498911 4636 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Oct 03 14:15:31 crc kubenswrapper[4636]: E1003 14:15:31.498946 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e000db3-2d29-4608-9a70-cfe88094a950-cert podName:6e000db3-2d29-4608-9a70-cfe88094a950 nodeName:}" failed. No retries permitted until 2025-10-03 14:15:32.498932977 +0000 UTC m=+882.357659224 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e000db3-2d29-4608-9a70-cfe88094a950-cert") pod "infra-operator-controller-manager-658588b8c9-6h5gc" (UID: "6e000db3-2d29-4608-9a70-cfe88094a950") : secret "infra-operator-webhook-server-cert" not found
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.522034 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-lkd7z"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.529430 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-cbxrx"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.537501 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vblzp\" (UniqueName: \"kubernetes.io/projected/2f612d08-a478-46d7-aefd-f31051af25d9-kube-api-access-vblzp\") pod \"octavia-operator-controller-manager-7468f855d8-48wqb\" (UID: \"2f612d08-a478-46d7-aefd-f31051af25d9\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-48wqb"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.546424 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-jcqmk"
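
Annotation: compare the two failures for the same infra-operator cert volume: the first retry (14:15:30.940424) was scheduled 500ms out, this one (14:15:31.498946) 1s out. That is the kubelet's per-operation exponential backoff from nestedpendingoperations.go. The 500ms start and the doubling are visible in the log itself; the factor of two beyond the second attempt and the roughly two-minute ceiling in the sketch below are assumptions based on the kubelet's exponential-backoff helper and should be checked against the source for your version. A toy sketch of the resulting schedule:

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	// Delay schedule implied by the log: 500ms, then 1s, then doubling.
    	// Only the first two values are confirmed by this log; the factor
    	// and the cap are assumptions mirroring kubelet's backoff helper.
    	delay := 500 * time.Millisecond
    	maxDelay := 2*time.Minute + 2*time.Second
    	for attempt := 1; attempt <= 10; attempt++ {
    		fmt.Printf("attempt %d: retry after %v\n", attempt, delay)
    		delay *= 2
    		if delay > maxDelay {
    			delay = maxDelay
    		}
    	}
    }
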
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.551255 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-579449c7d5-8nj2c"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.580420 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.598335 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-8qdtd"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.599388 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnz5l\" (UniqueName: \"kubernetes.io/projected/f8f9f506-672a-4f93-8645-f0cd608feed0-kube-api-access-vnz5l\") pod \"ovn-operator-controller-manager-579449c7d5-8nj2c\" (UID: \"f8f9f506-672a-4f93-8645-f0cd608feed0\") " pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-8nj2c"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.599460 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pxzz\" (UniqueName: \"kubernetes.io/projected/b3cb07c2-c2b9-4421-baba-ede1bed11656-kube-api-access-4pxzz\") pod \"swift-operator-controller-manager-6859f9b676-7qrjk\" (UID: \"b3cb07c2-c2b9-4421-baba-ede1bed11656\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-7qrjk"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.599486 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bc24\" (UniqueName: \"kubernetes.io/projected/314cbc97-254d-4e64-a06f-68c7b0488c46-kube-api-access-9bc24\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj\" (UID: \"314cbc97-254d-4e64-a06f-68c7b0488c46\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.599520 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/314cbc97-254d-4e64-a06f-68c7b0488c46-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj\" (UID: \"314cbc97-254d-4e64-a06f-68c7b0488c46\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.599559 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmj5l\" (UniqueName: \"kubernetes.io/projected/ad1290bf-25e9-4766-8398-ff4811e65cad-kube-api-access-zmj5l\") pod \"placement-operator-controller-manager-54689d9f88-lkd7z\" (UID: \"ad1290bf-25e9-4766-8398-ff4811e65cad\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lkd7z"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.599911 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-8qdtd"
Oct 03 14:15:31 crc kubenswrapper[4636]: E1003 14:15:31.600355 4636 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 03 14:15:31 crc kubenswrapper[4636]: E1003 14:15:31.600648 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/314cbc97-254d-4e64-a06f-68c7b0488c46-cert podName:314cbc97-254d-4e64-a06f-68c7b0488c46 nodeName:}" failed. No retries permitted until 2025-10-03 14:15:32.100633913 +0000 UTC m=+881.959360160 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/314cbc97-254d-4e64-a06f-68c7b0488c46-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj" (UID: "314cbc97-254d-4e64-a06f-68c7b0488c46") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.603967 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-4qjw9"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.611971 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-8qdtd"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.625991 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnz5l\" (UniqueName: \"kubernetes.io/projected/f8f9f506-672a-4f93-8645-f0cd608feed0-kube-api-access-vnz5l\") pod \"ovn-operator-controller-manager-579449c7d5-8nj2c\" (UID: \"f8f9f506-672a-4f93-8645-f0cd608feed0\") " pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-8nj2c"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.640433 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pxzz\" (UniqueName: \"kubernetes.io/projected/b3cb07c2-c2b9-4421-baba-ede1bed11656-kube-api-access-4pxzz\") pod \"swift-operator-controller-manager-6859f9b676-7qrjk\" (UID: \"b3cb07c2-c2b9-4421-baba-ede1bed11656\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-7qrjk"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.641038 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmj5l\" (UniqueName: \"kubernetes.io/projected/ad1290bf-25e9-4766-8398-ff4811e65cad-kube-api-access-zmj5l\") pod \"placement-operator-controller-manager-54689d9f88-lkd7z\" (UID: \"ad1290bf-25e9-4766-8398-ff4811e65cad\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lkd7z"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.642707 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bc24\" (UniqueName: \"kubernetes.io/projected/314cbc97-254d-4e64-a06f-68c7b0488c46-kube-api-access-9bc24\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj\" (UID: \"314cbc97-254d-4e64-a06f-68c7b0488c46\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.647614 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-48wqb"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.663965 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-lc5hb"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.668947 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-lc5hb"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.679817 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-hw5nd"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.692605 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-7qrjk"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.715886 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-lc5hb"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.741855 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-2tjsj"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.753124 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-xj7dp"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.754807 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-xj7dp"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.759068 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-46dm8"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.766823 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-xj7dp"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.795383 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-8nj2c"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.805490 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blf8b\" (UniqueName: \"kubernetes.io/projected/126025f8-40af-4a27-a9cc-8ece19d269b0-kube-api-access-blf8b\") pod \"telemetry-operator-controller-manager-5d4d74dd89-8qdtd\" (UID: \"126025f8-40af-4a27-a9cc-8ece19d269b0\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-8qdtd"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.805584 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgmpj\" (UniqueName: \"kubernetes.io/projected/58d5890d-301f-43e9-b627-40f17f79da7f-kube-api-access-tgmpj\") pod \"test-operator-controller-manager-5cd5cb47d7-lc5hb\" (UID: \"58d5890d-301f-43e9-b627-40f17f79da7f\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-lc5hb"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.844678 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6dfbbfcbb4-flhg6"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.846206 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6dfbbfcbb4-flhg6"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.848341 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.870430 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lkd7z"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.888010 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-txgnq"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.888503 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-7qrjk"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.903215 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6dfbbfcbb4-flhg6"]
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.906477 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blf8b\" (UniqueName: \"kubernetes.io/projected/126025f8-40af-4a27-a9cc-8ece19d269b0-kube-api-access-blf8b\") pod \"telemetry-operator-controller-manager-5d4d74dd89-8qdtd\" (UID: \"126025f8-40af-4a27-a9cc-8ece19d269b0\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-8qdtd"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.906556 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs2mz\" (UniqueName: \"kubernetes.io/projected/24b72852-6d98-4011-9643-5079fa6f8076-kube-api-access-gs2mz\") pod \"watcher-operator-controller-manager-6cbc6dd547-xj7dp\" (UID: \"24b72852-6d98-4011-9643-5079fa6f8076\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-xj7dp"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.906598 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgmpj\" (UniqueName: \"kubernetes.io/projected/58d5890d-301f-43e9-b627-40f17f79da7f-kube-api-access-tgmpj\") pod \"test-operator-controller-manager-5cd5cb47d7-lc5hb\" (UID: \"58d5890d-301f-43e9-b627-40f17f79da7f\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-lc5hb"
Oct 03 14:15:31 crc kubenswrapper[4636]: I1003 14:15:31.998068 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kmz9b"]
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.000680 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kmz9b"
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.008860 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a119c810-cd24-4c51-a23b-88776132f825-cert\") pod \"openstack-operator-controller-manager-6dfbbfcbb4-flhg6\" (UID: \"a119c810-cd24-4c51-a23b-88776132f825\") " pod="openstack-operators/openstack-operator-controller-manager-6dfbbfcbb4-flhg6"
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.008982 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h9qs\" (UniqueName: \"kubernetes.io/projected/a119c810-cd24-4c51-a23b-88776132f825-kube-api-access-7h9qs\") pod \"openstack-operator-controller-manager-6dfbbfcbb4-flhg6\" (UID: \"a119c810-cd24-4c51-a23b-88776132f825\") " pod="openstack-operators/openstack-operator-controller-manager-6dfbbfcbb4-flhg6"
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.009053 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs2mz\" (UniqueName: \"kubernetes.io/projected/24b72852-6d98-4011-9643-5079fa6f8076-kube-api-access-gs2mz\") pod \"watcher-operator-controller-manager-6cbc6dd547-xj7dp\" (UID: \"24b72852-6d98-4011-9643-5079fa6f8076\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-xj7dp"
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.013175 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blf8b\" (UniqueName: \"kubernetes.io/projected/126025f8-40af-4a27-a9cc-8ece19d269b0-kube-api-access-blf8b\") pod \"telemetry-operator-controller-manager-5d4d74dd89-8qdtd\" (UID: \"126025f8-40af-4a27-a9cc-8ece19d269b0\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-8qdtd"
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.013642 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kmz9b"]
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.017262 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-w5795"
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.018729 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgmpj\" (UniqueName: \"kubernetes.io/projected/58d5890d-301f-43e9-b627-40f17f79da7f-kube-api-access-tgmpj\") pod \"test-operator-controller-manager-5cd5cb47d7-lc5hb\" (UID: \"58d5890d-301f-43e9-b627-40f17f79da7f\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-lc5hb"
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.046286 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs2mz\" (UniqueName: \"kubernetes.io/projected/24b72852-6d98-4011-9643-5079fa6f8076-kube-api-access-gs2mz\") pod \"watcher-operator-controller-manager-6cbc6dd547-xj7dp\" (UID: \"24b72852-6d98-4011-9643-5079fa6f8076\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-xj7dp"
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.085627 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-lc5hb"
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.113518 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5f7c849b98-fh8lr"]
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.125047 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6mrm\" (UniqueName: \"kubernetes.io/projected/80c4c4f6-4616-48a9-98a7-f38ebdc58514-kube-api-access-s6mrm\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-kmz9b\" (UID: \"80c4c4f6-4616-48a9-98a7-f38ebdc58514\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kmz9b"
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.125134 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/314cbc97-254d-4e64-a06f-68c7b0488c46-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj\" (UID: \"314cbc97-254d-4e64-a06f-68c7b0488c46\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj"
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.125160 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a119c810-cd24-4c51-a23b-88776132f825-cert\") pod \"openstack-operator-controller-manager-6dfbbfcbb4-flhg6\" (UID: \"a119c810-cd24-4c51-a23b-88776132f825\") " pod="openstack-operators/openstack-operator-controller-manager-6dfbbfcbb4-flhg6"
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.125233 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h9qs\" (UniqueName: \"kubernetes.io/projected/a119c810-cd24-4c51-a23b-88776132f825-kube-api-access-7h9qs\") pod \"openstack-operator-controller-manager-6dfbbfcbb4-flhg6\" (UID: \"a119c810-cd24-4c51-a23b-88776132f825\") " pod="openstack-operators/openstack-operator-controller-manager-6dfbbfcbb4-flhg6"
Oct 03 14:15:32 crc kubenswrapper[4636]: E1003 14:15:32.125684 4636 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 03 14:15:32 crc kubenswrapper[4636]: E1003 14:15:32.125734 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/314cbc97-254d-4e64-a06f-68c7b0488c46-cert podName:314cbc97-254d-4e64-a06f-68c7b0488c46 nodeName:}" failed. No retries permitted until 2025-10-03 14:15:33.125718479 +0000 UTC m=+882.984444736 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/314cbc97-254d-4e64-a06f-68c7b0488c46-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj" (UID: "314cbc97-254d-4e64-a06f-68c7b0488c46") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.150806 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a119c810-cd24-4c51-a23b-88776132f825-cert\") pod \"openstack-operator-controller-manager-6dfbbfcbb4-flhg6\" (UID: \"a119c810-cd24-4c51-a23b-88776132f825\") " pod="openstack-operators/openstack-operator-controller-manager-6dfbbfcbb4-flhg6"
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.162936 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h9qs\" (UniqueName: \"kubernetes.io/projected/a119c810-cd24-4c51-a23b-88776132f825-kube-api-access-7h9qs\") pod \"openstack-operator-controller-manager-6dfbbfcbb4-flhg6\" (UID: \"a119c810-cd24-4c51-a23b-88776132f825\") " pod="openstack-operators/openstack-operator-controller-manager-6dfbbfcbb4-flhg6"
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.167428 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-xj7dp"
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.231866 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6mrm\" (UniqueName: \"kubernetes.io/projected/80c4c4f6-4616-48a9-98a7-f38ebdc58514-kube-api-access-s6mrm\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-kmz9b\" (UID: \"80c4c4f6-4616-48a9-98a7-f38ebdc58514\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kmz9b"
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.272886 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6dfbbfcbb4-flhg6"
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.303751 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6mrm\" (UniqueName: \"kubernetes.io/projected/80c4c4f6-4616-48a9-98a7-f38ebdc58514-kube-api-access-s6mrm\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-kmz9b\" (UID: \"80c4c4f6-4616-48a9-98a7-f38ebdc58514\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kmz9b"
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.310564 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-8qdtd"
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.372473 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kmz9b"
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.375631 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-x9xms"]
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.483126 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-fh8lr" event={"ID":"12b01d5f-b89d-4bf4-bd46-387f2a7ab48f","Type":"ContainerStarted","Data":"7747b4e7fbb817828a9735a1846bd905d506a49df17ed5628b1aef1813ef031e"}
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.491691 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-x9xms" event={"ID":"8002528c-8119-4119-923c-1e15162e63f3","Type":"ContainerStarted","Data":"bc9d59767f5375c0d1d8e08146f366e3e7d65a9af3e9616e3dc7a6b5975700ff"}
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.536821 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e000db3-2d29-4608-9a70-cfe88094a950-cert\") pod \"infra-operator-controller-manager-658588b8c9-6h5gc\" (UID: \"6e000db3-2d29-4608-9a70-cfe88094a950\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6h5gc"
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.544654 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e000db3-2d29-4608-9a70-cfe88094a950-cert\") pod \"infra-operator-controller-manager-658588b8c9-6h5gc\" (UID: \"6e000db3-2d29-4608-9a70-cfe88094a950\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6h5gc"
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.558047 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-8f58bc9db-nshvn"]
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.567734 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-z87w6"]
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.577235 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5568b5d68-g8m75"]
Oct 03 14:15:32 crc kubenswrapper[4636]: W1003 14:15:32.585234 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8d803e5_9eca_49bf_976a_2acdfc25a727.slice/crio-b40c76d0c5917e677bc0305da2b6afdbee7e5d4030269d8cfd4b8b9584b0c1cd WatchSource:0}: Error finding container b40c76d0c5917e677bc0305da2b6afdbee7e5d4030269d8cfd4b8b9584b0c1cd: Status 404 returned error can't find the container with id b40c76d0c5917e677bc0305da2b6afdbee7e5d4030269d8cfd4b8b9584b0c1cd
Oct 03 14:15:32 crc kubenswrapper[4636]: W1003 14:15:32.595839 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaba1b01_dfa6_48e4_b4f3_70a67fbfa8b5.slice/crio-76aa711e94a03f24e08bace7ca2a7b84a71d04a740113178d2aaacf774948540 WatchSource:0}: Error finding container 76aa711e94a03f24e08bace7ca2a7b84a71d04a740113178d2aaacf774948540: Status 404 returned error can't find the container with id 76aa711e94a03f24e08bace7ca2a7b84a71d04a740113178d2aaacf774948540
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.732476 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6h5gc"
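
Annotation: two harmless races surface here. The PLEG "ContainerStarted" events confirm the first sandboxes (barbican, designate) actually coming up, and the W-level manager.go:1169 entries are cAdvisor failing to find a cgroup for a container CRI-O created only milliseconds earlier; the 404 is transient and the container is generally picked up on a later housekeeping pass. Neither appears to warrant action during a mass pod start like this one.
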
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.755077 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-8knqr"]
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.818970 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-699b87f775-p7d6m"]
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.819011 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54876c876f-dwvpt"]
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.823859 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-655d88ccb9-jcqmk"]
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.827092 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-9dc5p"]
Oct 03 14:15:32 crc kubenswrapper[4636]: I1003 14:15:32.944273 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-cbxrx"]
Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.096184 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-lc5hb"]
Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.101765 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-2tjsj"]
Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.107698 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-579449c7d5-8nj2c"]
Oct 03 14:15:33 crc kubenswrapper[4636]: W1003 14:15:33.110340 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8f9f506_672a_4f93_8645_f0cd608feed0.slice/crio-598681d9a010c64d15d878c8a3898fbfa3111e83c9cc435a480f61fc3b6ea53a WatchSource:0}: Error finding container 598681d9a010c64d15d878c8a3898fbfa3111e83c9cc435a480f61fc3b6ea53a: Status 404 returned error can't find the container with id 598681d9a010c64d15d878c8a3898fbfa3111e83c9cc435a480f61fc3b6ea53a
Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.155452 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/314cbc97-254d-4e64-a06f-68c7b0488c46-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj\" (UID: \"314cbc97-254d-4e64-a06f-68c7b0488c46\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj"
Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.166617 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/314cbc97-254d-4e64-a06f-68c7b0488c46-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj\" (UID: \"314cbc97-254d-4e64-a06f-68c7b0488c46\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj"
Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.281253 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-48wqb"]
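
Annotation: this closes out both cert races recorded earlier. The infra-operator cert volume mounted at 14:15:32.544654 and the openstack-baremetal one at 14:15:33.166617, shortly after the corresponding webhook cert secrets appeared in the cache, so the earlier "secret ... not found" errors were transient ordering noise between the kubelet and whatever publishes those secrets, not a misconfiguration.
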
pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-48wqb"] Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.290249 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-lkd7z"] Oct 03 14:15:33 crc kubenswrapper[4636]: W1003 14:15:33.297232 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f612d08_a478_46d7_aefd_f31051af25d9.slice/crio-614288c4806d7949f3a1ff1a44a0069829d9beb625dcceec7204c0e41b6b2be8 WatchSource:0}: Error finding container 614288c4806d7949f3a1ff1a44a0069829d9beb625dcceec7204c0e41b6b2be8: Status 404 returned error can't find the container with id 614288c4806d7949f3a1ff1a44a0069829d9beb625dcceec7204c0e41b6b2be8 Oct 03 14:15:33 crc kubenswrapper[4636]: W1003 14:15:33.298530 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad1290bf_25e9_4766_8398_ff4811e65cad.slice/crio-3c36f46a2e58e6bda37cdff8a40d08918a8f57e90dd231bd6f93ede1a1b65270 WatchSource:0}: Error finding container 3c36f46a2e58e6bda37cdff8a40d08918a8f57e90dd231bd6f93ede1a1b65270: Status 404 returned error can't find the container with id 3c36f46a2e58e6bda37cdff8a40d08918a8f57e90dd231bd6f93ede1a1b65270 Oct 03 14:15:33 crc kubenswrapper[4636]: E1003 14:15:33.302600 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zmj5l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-54689d9f88-lkd7z_openstack-operators(ad1290bf-25e9-4766-8398-ff4811e65cad): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.321348 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj" Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.340190 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-7qrjk"] Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.349499 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6dfbbfcbb4-flhg6"] Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.353737 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-xj7dp"] Oct 03 14:15:33 crc kubenswrapper[4636]: E1003 14:15:33.357574 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4pxzz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-6859f9b676-7qrjk_openstack-operators(b3cb07c2-c2b9-4421-baba-ede1bed11656): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 14:15:33 crc kubenswrapper[4636]: E1003 14:15:33.364177 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gs2mz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6cbc6dd547-xj7dp_openstack-operators(24b72852-6d98-4011-9643-5079fa6f8076): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.504582 4636 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-8qdtd"] Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.521510 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-6h5gc"] Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.526275 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kmz9b"] Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.543385 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-7qrjk" event={"ID":"b3cb07c2-c2b9-4421-baba-ede1bed11656","Type":"ContainerStarted","Data":"6251f29759ebf4fb02003ce20c3e2360ad83266bb98542cda1dbc61ba66ebf4d"} Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.545071 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-xj7dp" event={"ID":"24b72852-6d98-4011-9643-5079fa6f8076","Type":"ContainerStarted","Data":"6b8de76371d6293a842fe2c1fbad7b3872a3107b9a67d63d1c59d21b10e1a637"} Oct 03 14:15:33 crc kubenswrapper[4636]: W1003 14:15:33.549314 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod126025f8_40af_4a27_a9cc_8ece19d269b0.slice/crio-0b0c6700e5525f9103fe842c70fa1b8455bb3bbcb71b2e276ad0e42ac39144bd WatchSource:0}: Error finding container 0b0c6700e5525f9103fe842c70fa1b8455bb3bbcb71b2e276ad0e42ac39144bd: Status 404 returned error can't find the container with id 0b0c6700e5525f9103fe842c70fa1b8455bb3bbcb71b2e276ad0e42ac39144bd Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.555438 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-lc5hb" event={"ID":"58d5890d-301f-43e9-b627-40f17f79da7f","Type":"ContainerStarted","Data":"afbeccd9ea620969a505a0fc587f89bfca349f75f20e9daec692c81841ed41cf"} Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.561385 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-g8m75" event={"ID":"3c207da6-bfc7-4287-aa67-56c0097f48f3","Type":"ContainerStarted","Data":"b02092dc8b53254ef38c5f3b0df774dbbc8f7875775e18b3ef20c558f8724f00"} Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.564044 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6dfbbfcbb4-flhg6" event={"ID":"a119c810-cd24-4c51-a23b-88776132f825","Type":"ContainerStarted","Data":"3a0d49ffc4982762475912bcfc26811cad06f204023933906af755e801d91c87"} Oct 03 14:15:33 crc kubenswrapper[4636]: E1003 14:15:33.567751 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lkd7z" podUID="ad1290bf-25e9-4766-8398-ff4811e65cad" Oct 03 14:15:33 crc kubenswrapper[4636]: E1003 14:15:33.568383 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s6mrm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-kmz9b_openstack-operators(80c4c4f6-4616-48a9-98a7-f38ebdc58514): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.568606 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dwvpt" event={"ID":"24c6a469-5b37-4dc9-baed-6a3c54b11861","Type":"ContainerStarted","Data":"cc611c38f01740b9ed4bd728656e2393dccae09f5ecced6fd20092d6314f0fe8"} Oct 03 14:15:33 crc kubenswrapper[4636]: E1003 14:15:33.572650 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kmz9b" podUID="80c4c4f6-4616-48a9-98a7-f38ebdc58514" Oct 03 14:15:33 crc kubenswrapper[4636]: E1003 14:15:33.585175 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nhz6l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-658588b8c9-6h5gc_openstack-operators(6e000db3-2d29-4608-9a70-cfe88094a950): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.590933 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lkd7z" event={"ID":"ad1290bf-25e9-4766-8398-ff4811e65cad","Type":"ContainerStarted","Data":"3c36f46a2e58e6bda37cdff8a40d08918a8f57e90dd231bd6f93ede1a1b65270"} Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.592379 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-jcqmk" event={"ID":"62436a9b-229c-486b-a715-6787e100d19b","Type":"ContainerStarted","Data":"3eab1d89b061a41b23be60b30b2eea71bb4bc5f5d7b0013e2591896bcbd0daf0"} Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.593502 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-nshvn" event={"ID":"eaba1b01-dfa6-48e4-b4f3-70a67fbfa8b5","Type":"ContainerStarted","Data":"76aa711e94a03f24e08bace7ca2a7b84a71d04a740113178d2aaacf774948540"} Oct 03 14:15:33 crc kubenswrapper[4636]: E1003 14:15:33.594779 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1\\\"\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lkd7z" podUID="ad1290bf-25e9-4766-8398-ff4811e65cad" Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.597355 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-z87w6" 
event={"ID":"c8d803e5-9eca-49bf-976a-2acdfc25a727","Type":"ContainerStarted","Data":"b40c76d0c5917e677bc0305da2b6afdbee7e5d4030269d8cfd4b8b9584b0c1cd"} Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.638120 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-8nj2c" event={"ID":"f8f9f506-672a-4f93-8645-f0cd608feed0","Type":"ContainerStarted","Data":"598681d9a010c64d15d878c8a3898fbfa3111e83c9cc435a480f61fc3b6ea53a"} Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.671540 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-48wqb" event={"ID":"2f612d08-a478-46d7-aefd-f31051af25d9","Type":"ContainerStarted","Data":"614288c4806d7949f3a1ff1a44a0069829d9beb625dcceec7204c0e41b6b2be8"} Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.676018 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-p7d6m" event={"ID":"d9a0c033-eaea-4336-96e6-9664f726e50e","Type":"ContainerStarted","Data":"40a897001ba9636a11ce483e2d8336d44d1db91a19f7857dcf3ae6c40bc496aa"} Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.679494 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-8knqr" event={"ID":"dff12a21-eff6-45da-bf37-d3f0620f9c05","Type":"ContainerStarted","Data":"cd5264974550c84f11e2daed11a08372ea89086a7ec2a38d3665669dbe5eefcb"} Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.683345 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-2tjsj" event={"ID":"8563d341-44cb-43b4-b7a8-ba3beeac60ea","Type":"ContainerStarted","Data":"253cc08b62be497560206e1ea9c738a27495cf25faf29e598bea5afb1d2c28a8"} Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.686724 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-cbxrx" event={"ID":"60ec0b38-a07e-46e2-bc94-1af33d301eb6","Type":"ContainerStarted","Data":"04887ba924af8ed0439d2b76b45ee394296b209255ea6e1a8c457bd0316af6d5"} Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.689742 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-9dc5p" event={"ID":"89e06d08-9381-4aff-ba52-682080bd03bb","Type":"ContainerStarted","Data":"36fd5d1bff62ee0968bdc3c323e1d7c917a9dc66a7ebe6a2a7cb0ae14eebfb64"} Oct 03 14:15:33 crc kubenswrapper[4636]: E1003 14:15:33.703684 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-xj7dp" podUID="24b72852-6d98-4011-9643-5079fa6f8076" Oct 03 14:15:33 crc kubenswrapper[4636]: E1003 14:15:33.758575 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-7qrjk" podUID="b3cb07c2-c2b9-4421-baba-ede1bed11656" Oct 03 14:15:33 crc kubenswrapper[4636]: I1003 14:15:33.938666 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj"] Oct 03 14:15:34 crc kubenswrapper[4636]: E1003 14:15:34.240679 4636 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6h5gc" podUID="6e000db3-2d29-4608-9a70-cfe88094a950" Oct 03 14:15:34 crc kubenswrapper[4636]: I1003 14:15:34.721997 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6h5gc" event={"ID":"6e000db3-2d29-4608-9a70-cfe88094a950","Type":"ContainerStarted","Data":"37f503be12bfab274507758874dc86634415294fec1ec6863848948b2df94c76"} Oct 03 14:15:34 crc kubenswrapper[4636]: I1003 14:15:34.722298 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6h5gc" event={"ID":"6e000db3-2d29-4608-9a70-cfe88094a950","Type":"ContainerStarted","Data":"0dcfdd1dfb0b621bc630329c3b7148c4406de7744034523ea946cdd6ee655bbd"} Oct 03 14:15:34 crc kubenswrapper[4636]: E1003 14:15:34.729494 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6h5gc" podUID="6e000db3-2d29-4608-9a70-cfe88094a950" Oct 03 14:15:34 crc kubenswrapper[4636]: I1003 14:15:34.748183 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lkd7z" event={"ID":"ad1290bf-25e9-4766-8398-ff4811e65cad","Type":"ContainerStarted","Data":"6845735d2a706f4932a8c68263a34afe08342b1907d46975099ee14e2e3fae55"} Oct 03 14:15:34 crc kubenswrapper[4636]: E1003 14:15:34.754361 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1\\\"\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lkd7z" podUID="ad1290bf-25e9-4766-8398-ff4811e65cad" Oct 03 14:15:34 crc kubenswrapper[4636]: I1003 14:15:34.766403 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj" event={"ID":"314cbc97-254d-4e64-a06f-68c7b0488c46","Type":"ContainerStarted","Data":"17b70a2a88ea69ce9706fa5838762a40178f17c6e82ac4dcca23c2f67340476c"} Oct 03 14:15:34 crc kubenswrapper[4636]: I1003 14:15:34.776786 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-7qrjk" event={"ID":"b3cb07c2-c2b9-4421-baba-ede1bed11656","Type":"ContainerStarted","Data":"1fb7b9b4a54b5499ab91d0539678a94b64b6e19c7e4487ece90f388ee710278e"} Oct 03 14:15:34 crc kubenswrapper[4636]: E1003 14:15:34.780164 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-7qrjk" podUID="b3cb07c2-c2b9-4421-baba-ede1bed11656" Oct 03 14:15:34 crc kubenswrapper[4636]: I1003 14:15:34.817534 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-8qdtd" event={"ID":"126025f8-40af-4a27-a9cc-8ece19d269b0","Type":"ContainerStarted","Data":"0b0c6700e5525f9103fe842c70fa1b8455bb3bbcb71b2e276ad0e42ac39144bd"} Oct 03 14:15:34 crc kubenswrapper[4636]: I1003 14:15:34.842244 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-xj7dp" event={"ID":"24b72852-6d98-4011-9643-5079fa6f8076","Type":"ContainerStarted","Data":"1e3040e4edf2a74e56208866a69729e9a778ec352760444da548f74107daf655"} Oct 03 14:15:34 crc kubenswrapper[4636]: E1003 14:15:34.847133 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-xj7dp" podUID="24b72852-6d98-4011-9643-5079fa6f8076" Oct 03 14:15:34 crc kubenswrapper[4636]: I1003 14:15:34.854335 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6dfbbfcbb4-flhg6" event={"ID":"a119c810-cd24-4c51-a23b-88776132f825","Type":"ContainerStarted","Data":"32e1146e65b35147450c4824ce8cd0a1d948aaeea8635dd827de40155645f785"} Oct 03 14:15:34 crc kubenswrapper[4636]: I1003 14:15:34.854374 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6dfbbfcbb4-flhg6" event={"ID":"a119c810-cd24-4c51-a23b-88776132f825","Type":"ContainerStarted","Data":"6b3f3bfa815ab4260647cb80161cc732f888cb85757e992ac307282cef96f5cc"} Oct 03 14:15:34 crc kubenswrapper[4636]: I1003 14:15:34.854639 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6dfbbfcbb4-flhg6" Oct 03 14:15:34 crc kubenswrapper[4636]: I1003 14:15:34.870997 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kmz9b" event={"ID":"80c4c4f6-4616-48a9-98a7-f38ebdc58514","Type":"ContainerStarted","Data":"7677f7dfe4db3837d13aa932146b8c0b72cbcea867b12abd5b5e4b8393db99d9"} Oct 03 14:15:34 crc kubenswrapper[4636]: E1003 14:15:34.872865 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kmz9b" podUID="80c4c4f6-4616-48a9-98a7-f38ebdc58514" Oct 03 14:15:34 crc kubenswrapper[4636]: I1003 14:15:34.897655 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6dfbbfcbb4-flhg6" podStartSLOduration=3.897620575 podStartE2EDuration="3.897620575s" podCreationTimestamp="2025-10-03 14:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:15:34.894352461 +0000 UTC m=+884.753078718" watchObservedRunningTime="2025-10-03 14:15:34.897620575 +0000 UTC m=+884.756346822" Oct 03 14:15:35 crc kubenswrapper[4636]: E1003 14:15:35.886013 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-7qrjk" podUID="b3cb07c2-c2b9-4421-baba-ede1bed11656" Oct 03 14:15:35 crc kubenswrapper[4636]: E1003 14:15:35.886129 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1\\\"\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lkd7z" podUID="ad1290bf-25e9-4766-8398-ff4811e65cad" Oct 03 14:15:35 crc kubenswrapper[4636]: E1003 14:15:35.886512 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6h5gc" podUID="6e000db3-2d29-4608-9a70-cfe88094a950" Oct 03 14:15:35 crc kubenswrapper[4636]: E1003 14:15:35.886548 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kmz9b" podUID="80c4c4f6-4616-48a9-98a7-f38ebdc58514" Oct 03 14:15:35 crc kubenswrapper[4636]: E1003 14:15:35.888535 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-xj7dp" podUID="24b72852-6d98-4011-9643-5079fa6f8076" Oct 03 14:15:42 crc kubenswrapper[4636]: I1003 14:15:42.286282 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6dfbbfcbb4-flhg6" Oct 03 14:15:48 crc kubenswrapper[4636]: E1003 14:15:48.046181 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:445a1332c0eaaa21a5459d3ffe56a8696a6a61131c39dc7bb47571b251a30830" Oct 03 14:15:48 crc kubenswrapper[4636]: E1003 14:15:48.046616 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:445a1332c0eaaa21a5459d3ffe56a8696a6a61131c39dc7bb47571b251a30830,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dv6js,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-7d4d4f8d-z87w6_openstack-operators(c8d803e5-9eca-49bf-976a-2acdfc25a727): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:15:48 crc kubenswrapper[4636]: E1003 14:15:48.578428 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e" Oct 03 14:15:48 crc kubenswrapper[4636]: E1003 14:15:48.578613 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-blf8b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5d4d74dd89-8qdtd_openstack-operators(126025f8-40af-4a27-a9cc-8ece19d269b0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:15:48 crc kubenswrapper[4636]: E1003 14:15:48.938291 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:354a1057bb423082aeda16c0209381a05266e90e30e216522c1462be7d4c4610" Oct 03 14:15:48 crc kubenswrapper[4636]: E1003 14:15:48.938726 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:354a1057bb423082aeda16c0209381a05266e90e30e216522c1462be7d4c4610,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dtzdn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-5568b5d68-g8m75_openstack-operators(3c207da6-bfc7-4287-aa67-56c0097f48f3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:15:50 crc kubenswrapper[4636]: E1003 14:15:50.980694 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:4cba007c18be1ec9aac2ece7a5ce6444a94afd89f0fb032522811d5bdf5bee73" Oct 03 14:15:50 crc kubenswrapper[4636]: E1003 14:15:50.982478 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:4cba007c18be1ec9aac2ece7a5ce6444a94afd89f0fb032522811d5bdf5bee73,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wbrgz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-54876c876f-dwvpt_openstack-operators(24c6a469-5b37-4dc9-baed-6a3c54b11861): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:15:52 crc kubenswrapper[4636]: E1003 14:15:52.884991 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862" Oct 03 14:15:52 crc kubenswrapper[4636]: E1003 14:15:52.885193 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h7cq2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-8d984cc4d-9dc5p_openstack-operators(89e06d08-9381-4aff-ba52-682080bd03bb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:15:54 crc kubenswrapper[4636]: E1003 14:15:54.318861 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182" Oct 03 14:15:54 crc kubenswrapper[4636]: E1003 14:15:54.319391 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vblzp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7468f855d8-48wqb_openstack-operators(2f612d08-a478-46d7-aefd-f31051af25d9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:15:54 crc kubenswrapper[4636]: E1003 14:15:54.791823 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:c438734cc669f60ba9d4692fab478cbd326c7de2539d482a21de54a1384ad7ac" Oct 03 14:15:54 crc kubenswrapper[4636]: E1003 14:15:54.792069 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:c438734cc669f60ba9d4692fab478cbd326c7de2539d482a21de54a1384ad7ac,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mzcqm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-699b87f775-p7d6m_openstack-operators(d9a0c033-eaea-4336-96e6-9664f726e50e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:15:56 crc kubenswrapper[4636]: E1003 14:15:56.942945 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-9dc5p" podUID="89e06d08-9381-4aff-ba52-682080bd03bb" Oct 03 14:15:56 crc kubenswrapper[4636]: E1003 14:15:56.971877 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-8qdtd" podUID="126025f8-40af-4a27-a9cc-8ece19d269b0" Oct 03 14:15:56 crc kubenswrapper[4636]: E1003 14:15:56.981204 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-p7d6m" podUID="d9a0c033-eaea-4336-96e6-9664f726e50e" Oct 03 14:15:57 crc kubenswrapper[4636]: I1003 14:15:57.009464 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-9dc5p" event={"ID":"89e06d08-9381-4aff-ba52-682080bd03bb","Type":"ContainerStarted","Data":"342044646ac92e3eb81e289be29bb91bd65ce05b07a709f50f0839c0975ee462"} Oct 03 14:15:57 crc kubenswrapper[4636]: E1003 14:15:57.010778 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-9dc5p" podUID="89e06d08-9381-4aff-ba52-682080bd03bb" Oct 03 14:15:57 crc kubenswrapper[4636]: I1003 14:15:57.011206 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-p7d6m" event={"ID":"d9a0c033-eaea-4336-96e6-9664f726e50e","Type":"ContainerStarted","Data":"0392bc332b1b7e94025648fdb57cc08786313d7c8343a1a5842a53decc35b015"} Oct 03 14:15:57 crc kubenswrapper[4636]: E1003 14:15:57.012231 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:c438734cc669f60ba9d4692fab478cbd326c7de2539d482a21de54a1384ad7ac\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-p7d6m" podUID="d9a0c033-eaea-4336-96e6-9664f726e50e" Oct 03 14:15:57 crc kubenswrapper[4636]: I1003 14:15:57.012921 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-8qdtd" event={"ID":"126025f8-40af-4a27-a9cc-8ece19d269b0","Type":"ContainerStarted","Data":"3844bbde6725554a7269b6e771df10c198e31c287814b8e18ea211eb2853c596"} Oct 03 14:15:57 crc kubenswrapper[4636]: E1003 14:15:57.016347 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-8qdtd" podUID="126025f8-40af-4a27-a9cc-8ece19d269b0" Oct 03 14:15:57 crc kubenswrapper[4636]: E1003 14:15:57.053312 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-48wqb" podUID="2f612d08-a478-46d7-aefd-f31051af25d9" Oct 03 14:15:57 crc kubenswrapper[4636]: E1003 14:15:57.109119 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-g8m75" podUID="3c207da6-bfc7-4287-aa67-56c0097f48f3" Oct 03 14:15:57 crc kubenswrapper[4636]: E1003 14:15:57.216579 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dwvpt" podUID="24c6a469-5b37-4dc9-baed-6a3c54b11861" Oct 03 14:15:57 crc kubenswrapper[4636]: E1003 14:15:57.244811 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-z87w6" podUID="c8d803e5-9eca-49bf-976a-2acdfc25a727" Oct 03 14:15:58 crc kubenswrapper[4636]: I1003 14:15:58.019561 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-7qrjk" event={"ID":"b3cb07c2-c2b9-4421-baba-ede1bed11656","Type":"ContainerStarted","Data":"7d9211acf2206f2a5ad0a8e0f84ef891735679f5b26166cc07422e6f95baf9ab"} Oct 03 14:15:58 crc kubenswrapper[4636]: I1003 14:15:58.021575 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dwvpt" event={"ID":"24c6a469-5b37-4dc9-baed-6a3c54b11861","Type":"ContainerStarted","Data":"c1f9eeaed9fe0bca6c78335fa83c8b91a9c5ef5a59d3c36cf0450cb1fdae0faf"} Oct 03 14:15:58 crc kubenswrapper[4636]: E1003 14:15:58.023036 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:4cba007c18be1ec9aac2ece7a5ce6444a94afd89f0fb032522811d5bdf5bee73\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dwvpt" podUID="24c6a469-5b37-4dc9-baed-6a3c54b11861" Oct 03 14:15:58 crc kubenswrapper[4636]: I1003 14:15:58.024241 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kmz9b" event={"ID":"80c4c4f6-4616-48a9-98a7-f38ebdc58514","Type":"ContainerStarted","Data":"ee475a7ffeed313ce88c42cd4730aa44553f3c55d8e0c08b64e946aee3ed3a4e"} Oct 03 14:15:58 crc kubenswrapper[4636]: I1003 14:15:58.025770 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-8nj2c" event={"ID":"f8f9f506-672a-4f93-8645-f0cd608feed0","Type":"ContainerStarted","Data":"26174fc0815a91a28085795eeb44bcf5f7ba7781d5865e066bd00f983e243e71"} Oct 03 14:15:58 crc kubenswrapper[4636]: I1003 14:15:58.026936 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-2tjsj" event={"ID":"8563d341-44cb-43b4-b7a8-ba3beeac60ea","Type":"ContainerStarted","Data":"26cc4f23edf7a94af55785776ad690b45a9deb19795b18392ade98d2584af57f"} Oct 03 14:15:58 crc kubenswrapper[4636]: I1003 14:15:58.028474 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-fh8lr" event={"ID":"12b01d5f-b89d-4bf4-bd46-387f2a7ab48f","Type":"ContainerStarted","Data":"4e94b3d47b23c60563544de25f27bf7f2d58572a44b8981de5310a508e79bfea"} Oct 03 14:15:58 crc kubenswrapper[4636]: I1003 14:15:58.029611 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-x9xms" event={"ID":"8002528c-8119-4119-923c-1e15162e63f3","Type":"ContainerStarted","Data":"4524c91e66e3c7ab239ecd06ce264b3372fa2f8d81c115eba7f17418a73fb9bf"} Oct 03 14:15:58 crc kubenswrapper[4636]: I1003 14:15:58.030627 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-g8m75" event={"ID":"3c207da6-bfc7-4287-aa67-56c0097f48f3","Type":"ContainerStarted","Data":"c494ebfa06fa8465db0890e714443dc19e16a6d9fef8fdbb86404dbe6f5486f4"} Oct 03 14:15:58 crc kubenswrapper[4636]: E1003 14:15:58.032467 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:354a1057bb423082aeda16c0209381a05266e90e30e216522c1462be7d4c4610\\\"\"" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-g8m75" podUID="3c207da6-bfc7-4287-aa67-56c0097f48f3" Oct 03 14:15:58 crc kubenswrapper[4636]: I1003 14:15:58.033556 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-lc5hb" event={"ID":"58d5890d-301f-43e9-b627-40f17f79da7f","Type":"ContainerStarted","Data":"a066844f1aef9a75822ee17533ca197216b15eda477ba17d2b4895ce9c24a55f"} Oct 03 14:15:58 crc kubenswrapper[4636]: I1003 14:15:58.035980 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj" event={"ID":"314cbc97-254d-4e64-a06f-68c7b0488c46","Type":"ContainerStarted","Data":"f85b11b822c9f6bc3d21c4ce2cb5c0a5061d4899db7c887c80dcfe5ad999da42"} Oct 03 14:15:58 crc 
kubenswrapper[4636]: I1003 14:15:58.037467 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-8knqr" event={"ID":"dff12a21-eff6-45da-bf37-d3f0620f9c05","Type":"ContainerStarted","Data":"edfdf74402686e0d0139e25f9c9d388b629fd2dd0d443dd27fd1a507177d7dd7"} Oct 03 14:15:58 crc kubenswrapper[4636]: I1003 14:15:58.038902 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-nshvn" event={"ID":"eaba1b01-dfa6-48e4-b4f3-70a67fbfa8b5","Type":"ContainerStarted","Data":"73ed88dce530d2c0a1ddadf7077e4f5eca68cdf0695b7252046a651f9167cc9d"} Oct 03 14:15:58 crc kubenswrapper[4636]: I1003 14:15:58.038933 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-nshvn" event={"ID":"eaba1b01-dfa6-48e4-b4f3-70a67fbfa8b5","Type":"ContainerStarted","Data":"4f0449810a58e5e4c044b645d3370969a15d46da096580f552367b55cad3dfd8"} Oct 03 14:15:58 crc kubenswrapper[4636]: I1003 14:15:58.040471 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-xj7dp" event={"ID":"24b72852-6d98-4011-9643-5079fa6f8076","Type":"ContainerStarted","Data":"85c3694e6fa24e8d090f88584c248f39629402dfb142f15299e84f7b10c65903"} Oct 03 14:15:58 crc kubenswrapper[4636]: I1003 14:15:58.041778 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-48wqb" event={"ID":"2f612d08-a478-46d7-aefd-f31051af25d9","Type":"ContainerStarted","Data":"fdd9461bdcc2de4edd6c1152604fde87a6959421af4aec3104be6dab1a201677"} Oct 03 14:15:58 crc kubenswrapper[4636]: E1003 14:15:58.042903 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-48wqb" podUID="2f612d08-a478-46d7-aefd-f31051af25d9" Oct 03 14:15:58 crc kubenswrapper[4636]: I1003 14:15:58.044195 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-z87w6" event={"ID":"c8d803e5-9eca-49bf-976a-2acdfc25a727","Type":"ContainerStarted","Data":"b15cd5c71a13125e0f7c8db0af02a005a2bcaf3b982a7819979068608443155d"} Oct 03 14:15:58 crc kubenswrapper[4636]: E1003 14:15:58.045014 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:445a1332c0eaaa21a5459d3ffe56a8696a6a61131c39dc7bb47571b251a30830\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-z87w6" podUID="c8d803e5-9eca-49bf-976a-2acdfc25a727" Oct 03 14:15:58 crc kubenswrapper[4636]: I1003 14:15:58.047155 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6h5gc" event={"ID":"6e000db3-2d29-4608-9a70-cfe88094a950","Type":"ContainerStarted","Data":"8633a37e50865145bbbd7ca5bdbf8747d429fd1cab3ac10b0ac4eccb25582e18"} Oct 03 14:15:58 crc kubenswrapper[4636]: I1003 14:15:58.047395 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6h5gc" Oct 03 14:15:58 crc kubenswrapper[4636]: I1003 14:15:58.049384 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lkd7z" event={"ID":"ad1290bf-25e9-4766-8398-ff4811e65cad","Type":"ContainerStarted","Data":"45844a42f932e53211aaec501be151718f651248d621edcdf54f2734e156818f"} Oct 03 14:15:58 crc kubenswrapper[4636]: I1003 14:15:58.050840 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-jcqmk" event={"ID":"62436a9b-229c-486b-a715-6787e100d19b","Type":"ContainerStarted","Data":"b93c20a1642df8b6d7f05998ecce6b9b3d09884012b7769f7278be38ecded0ee"} Oct 03 14:15:58 crc kubenswrapper[4636]: I1003 14:15:58.065162 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-cbxrx" event={"ID":"60ec0b38-a07e-46e2-bc94-1af33d301eb6","Type":"ContainerStarted","Data":"751d7396f8b8d48c31e2e081a6676feaa76c6415a2fcebe08444f793f42e546a"} Oct 03 14:15:58 crc kubenswrapper[4636]: E1003 14:15:58.069162 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-8qdtd" podUID="126025f8-40af-4a27-a9cc-8ece19d269b0" Oct 03 14:15:58 crc kubenswrapper[4636]: E1003 14:15:58.069262 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:c438734cc669f60ba9d4692fab478cbd326c7de2539d482a21de54a1384ad7ac\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-p7d6m" podUID="d9a0c033-eaea-4336-96e6-9664f726e50e" Oct 03 14:15:58 crc kubenswrapper[4636]: E1003 14:15:58.069371 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-9dc5p" podUID="89e06d08-9381-4aff-ba52-682080bd03bb" Oct 03 14:15:58 crc kubenswrapper[4636]: I1003 14:15:58.211119 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kmz9b" podStartSLOduration=5.466574389 podStartE2EDuration="27.211090452s" podCreationTimestamp="2025-10-03 14:15:31 +0000 UTC" firstStartedPulling="2025-10-03 14:15:33.568228611 +0000 UTC m=+883.426954868" lastFinishedPulling="2025-10-03 14:15:55.312744684 +0000 UTC m=+905.171470931" observedRunningTime="2025-10-03 14:15:58.156406856 +0000 UTC m=+908.015133123" watchObservedRunningTime="2025-10-03 14:15:58.211090452 +0000 UTC m=+908.069816699" Oct 03 14:15:58 crc kubenswrapper[4636]: I1003 14:15:58.485977 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6h5gc" podStartSLOduration=5.415702819 podStartE2EDuration="28.485959392s" podCreationTimestamp="2025-10-03 14:15:30 +0000 UTC" 
firstStartedPulling="2025-10-03 14:15:33.584985462 +0000 UTC m=+883.443711709" lastFinishedPulling="2025-10-03 14:15:56.655242035 +0000 UTC m=+906.513968282" observedRunningTime="2025-10-03 14:15:58.433283307 +0000 UTC m=+908.292009554" watchObservedRunningTime="2025-10-03 14:15:58.485959392 +0000 UTC m=+908.344685639" Oct 03 14:15:59 crc kubenswrapper[4636]: E1003 14:15:59.072297 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:4cba007c18be1ec9aac2ece7a5ce6444a94afd89f0fb032522811d5bdf5bee73\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dwvpt" podUID="24c6a469-5b37-4dc9-baed-6a3c54b11861" Oct 03 14:15:59 crc kubenswrapper[4636]: E1003 14:15:59.072756 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-48wqb" podUID="2f612d08-a478-46d7-aefd-f31051af25d9" Oct 03 14:16:00 crc kubenswrapper[4636]: I1003 14:16:00.076935 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj" event={"ID":"314cbc97-254d-4e64-a06f-68c7b0488c46","Type":"ContainerStarted","Data":"577a8c5f1fcab5c804b8e36ef9e64f796775a88b8f73d74b1922c8520f2d75b2"} Oct 03 14:16:00 crc kubenswrapper[4636]: I1003 14:16:00.079389 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-cbxrx" event={"ID":"60ec0b38-a07e-46e2-bc94-1af33d301eb6","Type":"ContainerStarted","Data":"09192f150f5b923bd55f5e30f01695c007dd99ce5c724eba17c830c878de298f"} Oct 03 14:16:00 crc kubenswrapper[4636]: I1003 14:16:00.079742 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-7qrjk" Oct 03 14:16:00 crc kubenswrapper[4636]: I1003 14:16:00.099881 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-7qrjk" podStartSLOduration=5.802801478 podStartE2EDuration="29.099859894s" podCreationTimestamp="2025-10-03 14:15:31 +0000 UTC" firstStartedPulling="2025-10-03 14:15:33.356902936 +0000 UTC m=+883.215629183" lastFinishedPulling="2025-10-03 14:15:56.653961352 +0000 UTC m=+906.512687599" observedRunningTime="2025-10-03 14:16:00.094131937 +0000 UTC m=+909.952858254" watchObservedRunningTime="2025-10-03 14:16:00.099859894 +0000 UTC m=+909.958586141" Oct 03 14:16:00 crc kubenswrapper[4636]: I1003 14:16:00.108975 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lkd7z" podStartSLOduration=5.757167584 podStartE2EDuration="29.108953918s" podCreationTimestamp="2025-10-03 14:15:31 +0000 UTC" firstStartedPulling="2025-10-03 14:15:33.302442995 +0000 UTC m=+883.161169242" lastFinishedPulling="2025-10-03 14:15:56.654229329 +0000 UTC m=+906.512955576" observedRunningTime="2025-10-03 14:16:00.107154472 +0000 UTC m=+909.965880729" watchObservedRunningTime="2025-10-03 14:16:00.108953918 +0000 UTC m=+909.967680175" Oct 03 14:16:00 crc kubenswrapper[4636]: I1003 
14:16:00.138280 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-xj7dp" podStartSLOduration=5.848026771 podStartE2EDuration="29.138263122s" podCreationTimestamp="2025-10-03 14:15:31 +0000 UTC" firstStartedPulling="2025-10-03 14:15:33.363985528 +0000 UTC m=+883.222711765" lastFinishedPulling="2025-10-03 14:15:56.654221869 +0000 UTC m=+906.512948116" observedRunningTime="2025-10-03 14:16:00.13741206 +0000 UTC m=+909.996138337" watchObservedRunningTime="2025-10-03 14:16:00.138263122 +0000 UTC m=+909.996989369" Oct 03 14:16:00 crc kubenswrapper[4636]: I1003 14:16:00.159012 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-nshvn" podStartSLOduration=7.487359547 podStartE2EDuration="30.158991255s" podCreationTimestamp="2025-10-03 14:15:30 +0000 UTC" firstStartedPulling="2025-10-03 14:15:32.626679115 +0000 UTC m=+882.485405362" lastFinishedPulling="2025-10-03 14:15:55.298310823 +0000 UTC m=+905.157037070" observedRunningTime="2025-10-03 14:16:00.155065004 +0000 UTC m=+910.013791271" watchObservedRunningTime="2025-10-03 14:16:00.158991255 +0000 UTC m=+910.017717502" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.090714 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-2tjsj" event={"ID":"8563d341-44cb-43b4-b7a8-ba3beeac60ea","Type":"ContainerStarted","Data":"1040257bd527d582c408cf012070db800fedd2d575517144ad7af942a77d464b"} Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.091787 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-2tjsj" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.096589 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-fh8lr" event={"ID":"12b01d5f-b89d-4bf4-bd46-387f2a7ab48f","Type":"ContainerStarted","Data":"dcf60045ac19b6f84c09c757620d82b92985a58638944b8ff66fc40f4deed6d5"} Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.097093 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-fh8lr" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.102205 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-nshvn" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.103593 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-lc5hb" event={"ID":"58d5890d-301f-43e9-b627-40f17f79da7f","Type":"ContainerStarted","Data":"187edd15576053d2d63df60fc887703218efb7208148fea937c79766c22aff70"} Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.103836 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-lc5hb" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.105867 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-8nj2c" event={"ID":"f8f9f506-672a-4f93-8645-f0cd608feed0","Type":"ContainerStarted","Data":"a19d096f03b4419cbc27a23917bdae2de99bf556079ec206a3337b4d70acadef"} Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.106160 
4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-8nj2c" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.108350 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-8knqr" event={"ID":"dff12a21-eff6-45da-bf37-d3f0620f9c05","Type":"ContainerStarted","Data":"d29f05b79b205dcdab3f63ae712bd685ec8c0a6a56eac6cb08a2c197a8311d11"} Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.108841 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-8knqr" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.115743 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-2tjsj" podStartSLOduration=8.935987017 podStartE2EDuration="31.115724684s" podCreationTimestamp="2025-10-03 14:15:30 +0000 UTC" firstStartedPulling="2025-10-03 14:15:33.120039633 +0000 UTC m=+882.978765880" lastFinishedPulling="2025-10-03 14:15:55.2997773 +0000 UTC m=+905.158503547" observedRunningTime="2025-10-03 14:16:01.11442438 +0000 UTC m=+910.973150657" watchObservedRunningTime="2025-10-03 14:16:01.115724684 +0000 UTC m=+910.974450931" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.115924 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-x9xms" event={"ID":"8002528c-8119-4119-923c-1e15162e63f3","Type":"ContainerStarted","Data":"a90c6ccb0b83711e7305af62f9fee5f7f7c8d377c94fbfdcb49a78f2adf433db"} Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.116040 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-x9xms" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.126385 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-g8m75" event={"ID":"3c207da6-bfc7-4287-aa67-56c0097f48f3","Type":"ContainerStarted","Data":"979c89b8d52518a779c5f68f419c5bd3eb1e3c428e3312d0fb32dd0c3cb804a2"} Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.126588 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-g8m75" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.133747 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-z87w6" event={"ID":"c8d803e5-9eca-49bf-976a-2acdfc25a727","Type":"ContainerStarted","Data":"7855cad70bd62a3e687eee2630cb5c7c50a3bc92bc8eed1904604fef774c0c9b"} Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.134361 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-z87w6" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.136974 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-jcqmk" event={"ID":"62436a9b-229c-486b-a715-6787e100d19b","Type":"ContainerStarted","Data":"b83c8a9f66a1578d09251f8162d4df280c4ef202413bc6ea1790d7d61be57a7e"} Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.137000 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-jcqmk" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.137342 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-cbxrx" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.137824 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.145893 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-lc5hb" podStartSLOduration=7.956494854 podStartE2EDuration="30.145877249s" podCreationTimestamp="2025-10-03 14:15:31 +0000 UTC" firstStartedPulling="2025-10-03 14:15:33.110995341 +0000 UTC m=+882.969721588" lastFinishedPulling="2025-10-03 14:15:55.300377736 +0000 UTC m=+905.159103983" observedRunningTime="2025-10-03 14:16:01.142741168 +0000 UTC m=+911.001467435" watchObservedRunningTime="2025-10-03 14:16:01.145877249 +0000 UTC m=+911.004603496" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.178881 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-8nj2c" podStartSLOduration=7.9965769049999995 podStartE2EDuration="30.178864288s" podCreationTimestamp="2025-10-03 14:15:31 +0000 UTC" firstStartedPulling="2025-10-03 14:15:33.117471317 +0000 UTC m=+882.976197564" lastFinishedPulling="2025-10-03 14:15:55.2997587 +0000 UTC m=+905.158484947" observedRunningTime="2025-10-03 14:16:01.16769752 +0000 UTC m=+911.026423777" watchObservedRunningTime="2025-10-03 14:16:01.178864288 +0000 UTC m=+911.037590535" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.185296 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-fh8lr" podStartSLOduration=7.955094978 podStartE2EDuration="31.185279833s" podCreationTimestamp="2025-10-03 14:15:30 +0000 UTC" firstStartedPulling="2025-10-03 14:15:32.068119928 +0000 UTC m=+881.926846175" lastFinishedPulling="2025-10-03 14:15:55.298304783 +0000 UTC m=+905.157031030" observedRunningTime="2025-10-03 14:16:01.181813353 +0000 UTC m=+911.040539620" watchObservedRunningTime="2025-10-03 14:16:01.185279833 +0000 UTC m=+911.044006080" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.208276 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-8knqr" podStartSLOduration=8.699489465 podStartE2EDuration="31.208255274s" podCreationTimestamp="2025-10-03 14:15:30 +0000 UTC" firstStartedPulling="2025-10-03 14:15:32.786562397 +0000 UTC m=+882.645288644" lastFinishedPulling="2025-10-03 14:15:55.295328206 +0000 UTC m=+905.154054453" observedRunningTime="2025-10-03 14:16:01.197690352 +0000 UTC m=+911.056416609" watchObservedRunningTime="2025-10-03 14:16:01.208255274 +0000 UTC m=+911.066981521" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.246720 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj" podStartSLOduration=8.910957614 podStartE2EDuration="30.246701702s" podCreationTimestamp="2025-10-03 14:15:31 +0000 UTC" firstStartedPulling="2025-10-03 14:15:33.959038394 +0000 UTC 
m=+883.817764641" lastFinishedPulling="2025-10-03 14:15:55.294782482 +0000 UTC m=+905.153508729" observedRunningTime="2025-10-03 14:16:01.232141308 +0000 UTC m=+911.090867555" watchObservedRunningTime="2025-10-03 14:16:01.246701702 +0000 UTC m=+911.105427949" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.250435 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-cbxrx" podStartSLOduration=8.958000913 podStartE2EDuration="31.250415308s" podCreationTimestamp="2025-10-03 14:15:30 +0000 UTC" firstStartedPulling="2025-10-03 14:15:33.002331176 +0000 UTC m=+882.861057423" lastFinishedPulling="2025-10-03 14:15:55.294745571 +0000 UTC m=+905.153471818" observedRunningTime="2025-10-03 14:16:01.246020505 +0000 UTC m=+911.104746762" watchObservedRunningTime="2025-10-03 14:16:01.250415308 +0000 UTC m=+911.109141555" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.260615 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-x9xms" podStartSLOduration=8.393063002 podStartE2EDuration="31.26059739s" podCreationTimestamp="2025-10-03 14:15:30 +0000 UTC" firstStartedPulling="2025-10-03 14:15:32.428745383 +0000 UTC m=+882.287471620" lastFinishedPulling="2025-10-03 14:15:55.296279761 +0000 UTC m=+905.155006008" observedRunningTime="2025-10-03 14:16:01.260569799 +0000 UTC m=+911.119296066" watchObservedRunningTime="2025-10-03 14:16:01.26059739 +0000 UTC m=+911.119323637" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.277123 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-g8m75" podStartSLOduration=3.542167862 podStartE2EDuration="31.277088034s" podCreationTimestamp="2025-10-03 14:15:30 +0000 UTC" firstStartedPulling="2025-10-03 14:15:32.598585132 +0000 UTC m=+882.457311379" lastFinishedPulling="2025-10-03 14:16:00.333505304 +0000 UTC m=+910.192231551" observedRunningTime="2025-10-03 14:16:01.275822931 +0000 UTC m=+911.134549178" watchObservedRunningTime="2025-10-03 14:16:01.277088034 +0000 UTC m=+911.135814281" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.292667 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-z87w6" podStartSLOduration=3.448853241 podStartE2EDuration="31.292651194s" podCreationTimestamp="2025-10-03 14:15:30 +0000 UTC" firstStartedPulling="2025-10-03 14:15:32.619788327 +0000 UTC m=+882.478514574" lastFinishedPulling="2025-10-03 14:16:00.46358628 +0000 UTC m=+910.322312527" observedRunningTime="2025-10-03 14:16:01.289912194 +0000 UTC m=+911.148638441" watchObservedRunningTime="2025-10-03 14:16:01.292651194 +0000 UTC m=+911.151377441" Oct 03 14:16:01 crc kubenswrapper[4636]: I1003 14:16:01.309182 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-jcqmk" podStartSLOduration=8.834103987 podStartE2EDuration="31.309151859s" podCreationTimestamp="2025-10-03 14:15:30 +0000 UTC" firstStartedPulling="2025-10-03 14:15:32.820078859 +0000 UTC m=+882.678805106" lastFinishedPulling="2025-10-03 14:15:55.295126721 +0000 UTC m=+905.153852978" observedRunningTime="2025-10-03 14:16:01.303689438 +0000 UTC m=+911.162415705" watchObservedRunningTime="2025-10-03 14:16:01.309151859 +0000 UTC m=+911.167878106" Oct 03 14:16:01 
crc kubenswrapper[4636]: I1003 14:16:01.871223 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lkd7z" Oct 03 14:16:02 crc kubenswrapper[4636]: I1003 14:16:02.088651 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-lc5hb" Oct 03 14:16:02 crc kubenswrapper[4636]: I1003 14:16:02.145827 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-8nj2c" Oct 03 14:16:02 crc kubenswrapper[4636]: I1003 14:16:02.146291 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-8knqr" Oct 03 14:16:02 crc kubenswrapper[4636]: I1003 14:16:02.147665 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-fh8lr" Oct 03 14:16:02 crc kubenswrapper[4636]: I1003 14:16:02.147962 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-2tjsj" Oct 03 14:16:02 crc kubenswrapper[4636]: I1003 14:16:02.149940 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-cbxrx" Oct 03 14:16:02 crc kubenswrapper[4636]: I1003 14:16:02.150188 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj" Oct 03 14:16:02 crc kubenswrapper[4636]: I1003 14:16:02.152958 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-jcqmk" Oct 03 14:16:02 crc kubenswrapper[4636]: I1003 14:16:02.153008 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-x9xms" Oct 03 14:16:02 crc kubenswrapper[4636]: I1003 14:16:02.168730 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-xj7dp" Oct 03 14:16:02 crc kubenswrapper[4636]: I1003 14:16:02.182792 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-xj7dp" Oct 03 14:16:02 crc kubenswrapper[4636]: I1003 14:16:02.739954 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-6h5gc" Oct 03 14:16:09 crc kubenswrapper[4636]: I1003 14:16:09.163487 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:16:09 crc kubenswrapper[4636]: I1003 14:16:09.164080 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:16:10 crc 
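
[Annotation] Probe results surface in two forms above: readiness flips are logged by the sync loop (status="" on the first probe, then status="ready"), while the machine-config-daemon liveness failure is a plain TCP refusal against its health endpoint. Kubelet HTTP probes treat any status in [200,400) as success, and a liveness probe that keeps failing up to its FailureThreshold (3 in the container spec fragment at the top of this excerpt, probed every PeriodSeconds:10) triggers a container restart. A stand-in for such a health endpoint, assuming the port and path from the log (the handler is illustrative, not the machine-config-daemon's code):

    package main

    import "net/http"

    func main() {
        http.HandleFunc("/health", func(w http.ResponseWriter, r *http.Request) {
            w.WriteHeader(http.StatusOK) // any 2xx/3xx counts as probe success
        })
        // With no listener on 127.0.0.1:8798, the kubelet logs exactly the
        // failure above: "dial tcp 127.0.0.1:8798: connect: connection refused".
        _ = http.ListenAndServe("127.0.0.1:8798", nil)
    }
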
kubenswrapper[4636]: I1003 14:16:10.189845 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-8qdtd" event={"ID":"126025f8-40af-4a27-a9cc-8ece19d269b0","Type":"ContainerStarted","Data":"5a71cc9345c9f4914e50fbf42efaeabb86d0f01370ce11157ac9e877b4233350"} Oct 03 14:16:10 crc kubenswrapper[4636]: I1003 14:16:10.191666 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-p7d6m" event={"ID":"d9a0c033-eaea-4336-96e6-9664f726e50e","Type":"ContainerStarted","Data":"db1dc1c1f659445924e2e44e5e84ac3d1afb95beb5401319bd0eac2ac534ad89"} Oct 03 14:16:10 crc kubenswrapper[4636]: I1003 14:16:10.191880 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-p7d6m" Oct 03 14:16:10 crc kubenswrapper[4636]: I1003 14:16:10.210553 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-8qdtd" podStartSLOduration=3.358484428 podStartE2EDuration="39.210537831s" podCreationTimestamp="2025-10-03 14:15:31 +0000 UTC" firstStartedPulling="2025-10-03 14:15:33.555287909 +0000 UTC m=+883.414014156" lastFinishedPulling="2025-10-03 14:16:09.407341312 +0000 UTC m=+919.266067559" observedRunningTime="2025-10-03 14:16:10.207978455 +0000 UTC m=+920.066704702" watchObservedRunningTime="2025-10-03 14:16:10.210537831 +0000 UTC m=+920.069264078" Oct 03 14:16:10 crc kubenswrapper[4636]: I1003 14:16:10.224218 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-p7d6m" podStartSLOduration=3.813395098 podStartE2EDuration="40.224202022s" podCreationTimestamp="2025-10-03 14:15:30 +0000 UTC" firstStartedPulling="2025-10-03 14:15:32.820198212 +0000 UTC m=+882.678924459" lastFinishedPulling="2025-10-03 14:16:09.231005136 +0000 UTC m=+919.089731383" observedRunningTime="2025-10-03 14:16:10.222269443 +0000 UTC m=+920.080995700" watchObservedRunningTime="2025-10-03 14:16:10.224202022 +0000 UTC m=+920.082928269" Oct 03 14:16:10 crc kubenswrapper[4636]: I1003 14:16:10.994385 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-z87w6" Oct 03 14:16:11 crc kubenswrapper[4636]: I1003 14:16:11.056404 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-g8m75" Oct 03 14:16:11 crc kubenswrapper[4636]: I1003 14:16:11.107319 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-nshvn" Oct 03 14:16:11 crc kubenswrapper[4636]: I1003 14:16:11.208067 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-48wqb" event={"ID":"2f612d08-a478-46d7-aefd-f31051af25d9","Type":"ContainerStarted","Data":"9783a15454bb80a87fc806ee64aeda14660b16dc72da93474e607dbb287edf21"} Oct 03 14:16:11 crc kubenswrapper[4636]: I1003 14:16:11.208296 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-48wqb" Oct 03 14:16:11 crc kubenswrapper[4636]: I1003 14:16:11.872815 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/placement-operator-controller-manager-54689d9f88-lkd7z" Oct 03 14:16:11 crc kubenswrapper[4636]: I1003 14:16:11.886475 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-48wqb" podStartSLOduration=3.786657992 podStartE2EDuration="40.886455618s" podCreationTimestamp="2025-10-03 14:15:31 +0000 UTC" firstStartedPulling="2025-10-03 14:15:33.301994624 +0000 UTC m=+883.160720871" lastFinishedPulling="2025-10-03 14:16:10.40179226 +0000 UTC m=+920.260518497" observedRunningTime="2025-10-03 14:16:11.224663186 +0000 UTC m=+921.083389423" watchObservedRunningTime="2025-10-03 14:16:11.886455618 +0000 UTC m=+921.745181865" Oct 03 14:16:11 crc kubenswrapper[4636]: I1003 14:16:11.892535 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-7qrjk" Oct 03 14:16:12 crc kubenswrapper[4636]: I1003 14:16:12.311143 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-8qdtd" Oct 03 14:16:13 crc kubenswrapper[4636]: I1003 14:16:13.231685 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-9dc5p" event={"ID":"89e06d08-9381-4aff-ba52-682080bd03bb","Type":"ContainerStarted","Data":"49fa92013674031c098c0c7af831c1c1403576a883af0edf04346389200727b3"} Oct 03 14:16:13 crc kubenswrapper[4636]: I1003 14:16:13.231928 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-9dc5p" Oct 03 14:16:13 crc kubenswrapper[4636]: I1003 14:16:13.255735 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-9dc5p" podStartSLOduration=3.849212418 podStartE2EDuration="43.255719686s" podCreationTimestamp="2025-10-03 14:15:30 +0000 UTC" firstStartedPulling="2025-10-03 14:15:32.823321172 +0000 UTC m=+882.682047419" lastFinishedPulling="2025-10-03 14:16:12.22982844 +0000 UTC m=+922.088554687" observedRunningTime="2025-10-03 14:16:13.250486782 +0000 UTC m=+923.109213029" watchObservedRunningTime="2025-10-03 14:16:13.255719686 +0000 UTC m=+923.114445933" Oct 03 14:16:14 crc kubenswrapper[4636]: I1003 14:16:14.239537 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dwvpt" event={"ID":"24c6a469-5b37-4dc9-baed-6a3c54b11861","Type":"ContainerStarted","Data":"3aae5badf1fe3217ba7e948af64a45407200addad75c7e0ea5c8f90e1fcb5b8e"} Oct 03 14:16:14 crc kubenswrapper[4636]: I1003 14:16:14.255697 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dwvpt" podStartSLOduration=3.47093177 podStartE2EDuration="44.255675267s" podCreationTimestamp="2025-10-03 14:15:30 +0000 UTC" firstStartedPulling="2025-10-03 14:15:32.838166694 +0000 UTC m=+882.696892941" lastFinishedPulling="2025-10-03 14:16:13.622910191 +0000 UTC m=+923.481636438" observedRunningTime="2025-10-03 14:16:14.253573053 +0000 UTC m=+924.112299320" watchObservedRunningTime="2025-10-03 14:16:14.255675267 +0000 UTC m=+924.114401514" Oct 03 14:16:21 crc kubenswrapper[4636]: I1003 14:16:21.114397 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dwvpt" Oct 03 14:16:21 crc kubenswrapper[4636]: I1003 14:16:21.116159 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dwvpt" Oct 03 14:16:21 crc kubenswrapper[4636]: I1003 14:16:21.194943 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-p7d6m" Oct 03 14:16:21 crc kubenswrapper[4636]: I1003 14:16:21.476373 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-9dc5p" Oct 03 14:16:21 crc kubenswrapper[4636]: I1003 14:16:21.650952 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-48wqb" Oct 03 14:16:22 crc kubenswrapper[4636]: I1003 14:16:22.312887 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-8qdtd" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.162809 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.163494 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.229173 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-f89fc"] Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.230255 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-f89fc" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.234556 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.235135 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.235301 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.235492 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-5szpx" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.242513 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-f89fc"] Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.318196 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fvqnr"] Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.319461 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fvqnr" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.323657 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.330251 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fvqnr"] Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.345061 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa94d03e-bd55-47a5-8813-162199936a3d-config\") pod \"dnsmasq-dns-675f4bcbfc-f89fc\" (UID: \"fa94d03e-bd55-47a5-8813-162199936a3d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-f89fc" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.345156 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mc6t\" (UniqueName: \"kubernetes.io/projected/fa94d03e-bd55-47a5-8813-162199936a3d-kube-api-access-8mc6t\") pod \"dnsmasq-dns-675f4bcbfc-f89fc\" (UID: \"fa94d03e-bd55-47a5-8813-162199936a3d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-f89fc" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.446293 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa94d03e-bd55-47a5-8813-162199936a3d-config\") pod \"dnsmasq-dns-675f4bcbfc-f89fc\" (UID: \"fa94d03e-bd55-47a5-8813-162199936a3d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-f89fc" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.447296 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2csvz\" (UniqueName: \"kubernetes.io/projected/85ebe8f6-b599-49cc-8477-6eeedf047ac4-kube-api-access-2csvz\") pod \"dnsmasq-dns-78dd6ddcc-fvqnr\" (UID: \"85ebe8f6-b599-49cc-8477-6eeedf047ac4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fvqnr" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.447387 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mc6t\" (UniqueName: \"kubernetes.io/projected/fa94d03e-bd55-47a5-8813-162199936a3d-kube-api-access-8mc6t\") pod \"dnsmasq-dns-675f4bcbfc-f89fc\" (UID: \"fa94d03e-bd55-47a5-8813-162199936a3d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-f89fc" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.447233 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa94d03e-bd55-47a5-8813-162199936a3d-config\") pod \"dnsmasq-dns-675f4bcbfc-f89fc\" (UID: \"fa94d03e-bd55-47a5-8813-162199936a3d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-f89fc" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.447457 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ebe8f6-b599-49cc-8477-6eeedf047ac4-config\") pod \"dnsmasq-dns-78dd6ddcc-fvqnr\" (UID: \"85ebe8f6-b599-49cc-8477-6eeedf047ac4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fvqnr" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.447528 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85ebe8f6-b599-49cc-8477-6eeedf047ac4-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fvqnr\" (UID: \"85ebe8f6-b599-49cc-8477-6eeedf047ac4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fvqnr" Oct 03 
14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.468384 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mc6t\" (UniqueName: \"kubernetes.io/projected/fa94d03e-bd55-47a5-8813-162199936a3d-kube-api-access-8mc6t\") pod \"dnsmasq-dns-675f4bcbfc-f89fc\" (UID: \"fa94d03e-bd55-47a5-8813-162199936a3d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-f89fc" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.547978 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-f89fc" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.548741 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2csvz\" (UniqueName: \"kubernetes.io/projected/85ebe8f6-b599-49cc-8477-6eeedf047ac4-kube-api-access-2csvz\") pod \"dnsmasq-dns-78dd6ddcc-fvqnr\" (UID: \"85ebe8f6-b599-49cc-8477-6eeedf047ac4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fvqnr" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.548798 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ebe8f6-b599-49cc-8477-6eeedf047ac4-config\") pod \"dnsmasq-dns-78dd6ddcc-fvqnr\" (UID: \"85ebe8f6-b599-49cc-8477-6eeedf047ac4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fvqnr" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.548845 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85ebe8f6-b599-49cc-8477-6eeedf047ac4-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fvqnr\" (UID: \"85ebe8f6-b599-49cc-8477-6eeedf047ac4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fvqnr" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.549797 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85ebe8f6-b599-49cc-8477-6eeedf047ac4-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fvqnr\" (UID: \"85ebe8f6-b599-49cc-8477-6eeedf047ac4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fvqnr" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.549868 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ebe8f6-b599-49cc-8477-6eeedf047ac4-config\") pod \"dnsmasq-dns-78dd6ddcc-fvqnr\" (UID: \"85ebe8f6-b599-49cc-8477-6eeedf047ac4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fvqnr" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.566698 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2csvz\" (UniqueName: \"kubernetes.io/projected/85ebe8f6-b599-49cc-8477-6eeedf047ac4-kube-api-access-2csvz\") pod \"dnsmasq-dns-78dd6ddcc-fvqnr\" (UID: \"85ebe8f6-b599-49cc-8477-6eeedf047ac4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fvqnr" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.638911 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fvqnr" Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.981495 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-f89fc"] Oct 03 14:16:39 crc kubenswrapper[4636]: I1003 14:16:39.991081 4636 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 14:16:40 crc kubenswrapper[4636]: W1003 14:16:40.304832 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85ebe8f6_b599_49cc_8477_6eeedf047ac4.slice/crio-b7d9a575d99f17045e58e46fc7ced4689bb813568ea3be87ec02e46ba4108ca8 WatchSource:0}: Error finding container b7d9a575d99f17045e58e46fc7ced4689bb813568ea3be87ec02e46ba4108ca8: Status 404 returned error can't find the container with id b7d9a575d99f17045e58e46fc7ced4689bb813568ea3be87ec02e46ba4108ca8 Oct 03 14:16:40 crc kubenswrapper[4636]: I1003 14:16:40.306405 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fvqnr"] Oct 03 14:16:40 crc kubenswrapper[4636]: I1003 14:16:40.414409 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-f89fc" event={"ID":"fa94d03e-bd55-47a5-8813-162199936a3d","Type":"ContainerStarted","Data":"d144a23637d78cda7a51e49ecddb5f101ba35bf2ed8b29ea87f164673d55891d"} Oct 03 14:16:40 crc kubenswrapper[4636]: I1003 14:16:40.415667 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-fvqnr" event={"ID":"85ebe8f6-b599-49cc-8477-6eeedf047ac4","Type":"ContainerStarted","Data":"b7d9a575d99f17045e58e46fc7ced4689bb813568ea3be87ec02e46ba4108ca8"} Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.213698 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-f89fc"] Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.238052 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5dc2p"] Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.239190 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-5dc2p"
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.301519 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5dc2p"]
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.399837 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6ae5571-6f09-4f64-b071-d669dc4d3f1f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-5dc2p\" (UID: \"c6ae5571-6f09-4f64-b071-d669dc4d3f1f\") " pod="openstack/dnsmasq-dns-666b6646f7-5dc2p"
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.399903 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxgs7\" (UniqueName: \"kubernetes.io/projected/c6ae5571-6f09-4f64-b071-d669dc4d3f1f-kube-api-access-gxgs7\") pod \"dnsmasq-dns-666b6646f7-5dc2p\" (UID: \"c6ae5571-6f09-4f64-b071-d669dc4d3f1f\") " pod="openstack/dnsmasq-dns-666b6646f7-5dc2p"
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.399935 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6ae5571-6f09-4f64-b071-d669dc4d3f1f-config\") pod \"dnsmasq-dns-666b6646f7-5dc2p\" (UID: \"c6ae5571-6f09-4f64-b071-d669dc4d3f1f\") " pod="openstack/dnsmasq-dns-666b6646f7-5dc2p"
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.501707 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6ae5571-6f09-4f64-b071-d669dc4d3f1f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-5dc2p\" (UID: \"c6ae5571-6f09-4f64-b071-d669dc4d3f1f\") " pod="openstack/dnsmasq-dns-666b6646f7-5dc2p"
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.501769 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxgs7\" (UniqueName: \"kubernetes.io/projected/c6ae5571-6f09-4f64-b071-d669dc4d3f1f-kube-api-access-gxgs7\") pod \"dnsmasq-dns-666b6646f7-5dc2p\" (UID: \"c6ae5571-6f09-4f64-b071-d669dc4d3f1f\") " pod="openstack/dnsmasq-dns-666b6646f7-5dc2p"
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.501800 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6ae5571-6f09-4f64-b071-d669dc4d3f1f-config\") pod \"dnsmasq-dns-666b6646f7-5dc2p\" (UID: \"c6ae5571-6f09-4f64-b071-d669dc4d3f1f\") " pod="openstack/dnsmasq-dns-666b6646f7-5dc2p"
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.502690 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6ae5571-6f09-4f64-b071-d669dc4d3f1f-config\") pod \"dnsmasq-dns-666b6646f7-5dc2p\" (UID: \"c6ae5571-6f09-4f64-b071-d669dc4d3f1f\") " pod="openstack/dnsmasq-dns-666b6646f7-5dc2p"
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.502696 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6ae5571-6f09-4f64-b071-d669dc4d3f1f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-5dc2p\" (UID: \"c6ae5571-6f09-4f64-b071-d669dc4d3f1f\") " pod="openstack/dnsmasq-dns-666b6646f7-5dc2p"
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.550218 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxgs7\" (UniqueName: \"kubernetes.io/projected/c6ae5571-6f09-4f64-b071-d669dc4d3f1f-kube-api-access-gxgs7\") pod \"dnsmasq-dns-666b6646f7-5dc2p\" (UID: \"c6ae5571-6f09-4f64-b071-d669dc4d3f1f\") " pod="openstack/dnsmasq-dns-666b6646f7-5dc2p"
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.565716 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-5dc2p"
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.634905 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fvqnr"]
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.653437 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rxpgk"]
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.654785 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rxpgk"
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.683242 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rxpgk"]
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.805237 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/948352a1-041e-447b-b36a-97438f39d0e8-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rxpgk\" (UID: \"948352a1-041e-447b-b36a-97438f39d0e8\") " pod="openstack/dnsmasq-dns-57d769cc4f-rxpgk"
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.805273 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f6tk\" (UniqueName: \"kubernetes.io/projected/948352a1-041e-447b-b36a-97438f39d0e8-kube-api-access-8f6tk\") pod \"dnsmasq-dns-57d769cc4f-rxpgk\" (UID: \"948352a1-041e-447b-b36a-97438f39d0e8\") " pod="openstack/dnsmasq-dns-57d769cc4f-rxpgk"
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.805303 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/948352a1-041e-447b-b36a-97438f39d0e8-config\") pod \"dnsmasq-dns-57d769cc4f-rxpgk\" (UID: \"948352a1-041e-447b-b36a-97438f39d0e8\") " pod="openstack/dnsmasq-dns-57d769cc4f-rxpgk"
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.910657 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/948352a1-041e-447b-b36a-97438f39d0e8-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rxpgk\" (UID: \"948352a1-041e-447b-b36a-97438f39d0e8\") " pod="openstack/dnsmasq-dns-57d769cc4f-rxpgk"
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.911008 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f6tk\" (UniqueName: \"kubernetes.io/projected/948352a1-041e-447b-b36a-97438f39d0e8-kube-api-access-8f6tk\") pod \"dnsmasq-dns-57d769cc4f-rxpgk\" (UID: \"948352a1-041e-447b-b36a-97438f39d0e8\") " pod="openstack/dnsmasq-dns-57d769cc4f-rxpgk"
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.911032 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/948352a1-041e-447b-b36a-97438f39d0e8-config\") pod \"dnsmasq-dns-57d769cc4f-rxpgk\" (UID: \"948352a1-041e-447b-b36a-97438f39d0e8\") " pod="openstack/dnsmasq-dns-57d769cc4f-rxpgk"
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.911723 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/948352a1-041e-447b-b36a-97438f39d0e8-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rxpgk\" (UID: \"948352a1-041e-447b-b36a-97438f39d0e8\") " pod="openstack/dnsmasq-dns-57d769cc4f-rxpgk"
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.912030 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/948352a1-041e-447b-b36a-97438f39d0e8-config\") pod \"dnsmasq-dns-57d769cc4f-rxpgk\" (UID: \"948352a1-041e-447b-b36a-97438f39d0e8\") " pod="openstack/dnsmasq-dns-57d769cc4f-rxpgk"
Oct 03 14:16:42 crc kubenswrapper[4636]: I1003 14:16:42.938315 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f6tk\" (UniqueName: \"kubernetes.io/projected/948352a1-041e-447b-b36a-97438f39d0e8-kube-api-access-8f6tk\") pod \"dnsmasq-dns-57d769cc4f-rxpgk\" (UID: \"948352a1-041e-447b-b36a-97438f39d0e8\") " pod="openstack/dnsmasq-dns-57d769cc4f-rxpgk"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.005761 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rxpgk"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.096542 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5dc2p"]
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.424244 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.428457 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.435630 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.435675 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.435749 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.435794 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.435882 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.435894 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-xnwcw"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.435933 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.450614 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.498220 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-5dc2p" event={"ID":"c6ae5571-6f09-4f64-b071-d669dc4d3f1f","Type":"ContainerStarted","Data":"358368f9145c41364b79b28c5e9db36c2f845a766ca129aecacf9bd37e2535e3"}
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.523182 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/61bd2d74-76de-402c-99af-f18ddf19610c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.523256 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.523280 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/61bd2d74-76de-402c-99af-f18ddf19610c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.523327 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.523355 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vbvn\" (UniqueName: \"kubernetes.io/projected/61bd2d74-76de-402c-99af-f18ddf19610c-kube-api-access-6vbvn\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.523378 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.523399 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/61bd2d74-76de-402c-99af-f18ddf19610c-config-data\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.523456 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.523496 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/61bd2d74-76de-402c-99af-f18ddf19610c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.523548 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/61bd2d74-76de-402c-99af-f18ddf19610c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.523581 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.566283 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rxpgk"]
Oct 03 14:16:43 crc kubenswrapper[4636]: W1003 14:16:43.579927 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod948352a1_041e_447b_b36a_97438f39d0e8.slice/crio-0a90a3e3bd66e3264527c0a73ac59419790907ebcb8f0f8c12bdce9194b6ecbb WatchSource:0}: Error finding container 0a90a3e3bd66e3264527c0a73ac59419790907ebcb8f0f8c12bdce9194b6ecbb: Status 404 returned error can't find the container with id 0a90a3e3bd66e3264527c0a73ac59419790907ebcb8f0f8c12bdce9194b6ecbb
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.625419 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/61bd2d74-76de-402c-99af-f18ddf19610c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.625526 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/61bd2d74-76de-402c-99af-f18ddf19610c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.625555 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.625580 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/61bd2d74-76de-402c-99af-f18ddf19610c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.625646 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.625660 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/61bd2d74-76de-402c-99af-f18ddf19610c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.625695 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.625718 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vbvn\" (UniqueName: \"kubernetes.io/projected/61bd2d74-76de-402c-99af-f18ddf19610c-kube-api-access-6vbvn\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.626212 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.626398 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.626426 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/61bd2d74-76de-402c-99af-f18ddf19610c-config-data\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.626447 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.626597 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.629995 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/61bd2d74-76de-402c-99af-f18ddf19610c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.630673 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/61bd2d74-76de-402c-99af-f18ddf19610c-config-data\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.630911 4636 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.632748 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/61bd2d74-76de-402c-99af-f18ddf19610c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.635281 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/61bd2d74-76de-402c-99af-f18ddf19610c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.640687 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/61bd2d74-76de-402c-99af-f18ddf19610c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.645612 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vbvn\" (UniqueName: \"kubernetes.io/projected/61bd2d74-76de-402c-99af-f18ddf19610c-kube-api-access-6vbvn\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.650090 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.651442 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.678136 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.771539 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.799669 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.859426 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.862184 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.866433 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.870182 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.870295 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.870205 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-hgc9d"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.870417 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.870458 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.874924 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.945480 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5f862438-7485-4e2c-a5b5-a6f3acf809ab-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.945540 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.945564 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5f862438-7485-4e2c-a5b5-a6f3acf809ab-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.945579 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5f862438-7485-4e2c-a5b5-a6f3acf809ab-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.945751 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.945790 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.945828 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f862438-7485-4e2c-a5b5-a6f3acf809ab-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.945863 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2p6h\" (UniqueName: \"kubernetes.io/projected/5f862438-7485-4e2c-a5b5-a6f3acf809ab-kube-api-access-j2p6h\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.945931 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5f862438-7485-4e2c-a5b5-a6f3acf809ab-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.945967 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:43 crc kubenswrapper[4636]: I1003 14:16:43.945986 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.046866 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.046931 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.046981 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5f862438-7485-4e2c-a5b5-a6f3acf809ab-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.047023 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.047068 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5f862438-7485-4e2c-a5b5-a6f3acf809ab-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.047084 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5f862438-7485-4e2c-a5b5-a6f3acf809ab-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.047151 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.047168 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.047189 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f862438-7485-4e2c-a5b5-a6f3acf809ab-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.047435 4636 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.048533 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5f862438-7485-4e2c-a5b5-a6f3acf809ab-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.049020 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5f862438-7485-4e2c-a5b5-a6f3acf809ab-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.049470 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.049822 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2p6h\" (UniqueName: \"kubernetes.io/projected/5f862438-7485-4e2c-a5b5-a6f3acf809ab-kube-api-access-j2p6h\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.049885 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.049912 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5f862438-7485-4e2c-a5b5-a6f3acf809ab-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.050004 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f862438-7485-4e2c-a5b5-a6f3acf809ab-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.052700 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5f862438-7485-4e2c-a5b5-a6f3acf809ab-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.052737 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.052801 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.053848 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5f862438-7485-4e2c-a5b5-a6f3acf809ab-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.084465 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2p6h\" (UniqueName: \"kubernetes.io/projected/5f862438-7485-4e2c-a5b5-a6f3acf809ab-kube-api-access-j2p6h\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.085759 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.188386 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.307037 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.512983 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"61bd2d74-76de-402c-99af-f18ddf19610c","Type":"ContainerStarted","Data":"2123c5fa2a14a66e241efce58f7257e0f5dbe885fd8d2254c53a22ac88cc654e"}
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.514452 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rxpgk" event={"ID":"948352a1-041e-447b-b36a-97438f39d0e8","Type":"ContainerStarted","Data":"0a90a3e3bd66e3264527c0a73ac59419790907ebcb8f0f8c12bdce9194b6ecbb"}
Oct 03 14:16:44 crc kubenswrapper[4636]: I1003 14:16:44.739503 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 03 14:16:44 crc kubenswrapper[4636]: W1003 14:16:44.814397 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f862438_7485_4e2c_a5b5_a6f3acf809ab.slice/crio-3eeb83d2eabc49e32d04e8d6e7edf54b059f95f9ad21d540f4b633d992db172f WatchSource:0}: Error finding container 3eeb83d2eabc49e32d04e8d6e7edf54b059f95f9ad21d540f4b633d992db172f: Status 404 returned error can't find the container with id 3eeb83d2eabc49e32d04e8d6e7edf54b059f95f9ad21d540f4b633d992db172f
Oct 03 14:16:45 crc kubenswrapper[4636]: I1003 14:16:45.536818 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5f862438-7485-4e2c-a5b5-a6f3acf809ab","Type":"ContainerStarted","Data":"3eeb83d2eabc49e32d04e8d6e7edf54b059f95f9ad21d540f4b633d992db172f"}
Oct 03 14:16:45 crc kubenswrapper[4636]: I1003 14:16:45.943405 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Oct 03 14:16:45 crc kubenswrapper[4636]: I1003 14:16:45.944989 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Oct 03 14:16:45 crc kubenswrapper[4636]: I1003 14:16:45.955804 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Oct 03 14:16:45 crc kubenswrapper[4636]: I1003 14:16:45.956038 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Oct 03 14:16:45 crc kubenswrapper[4636]: I1003 14:16:45.956188 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Oct 03 14:16:45 crc kubenswrapper[4636]: I1003 14:16:45.956309 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-w8wxs"
Oct 03 14:16:45 crc kubenswrapper[4636]: I1003 14:16:45.960416 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Oct 03 14:16:45 crc kubenswrapper[4636]: I1003 14:16:45.962956 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Oct 03 14:16:45 crc kubenswrapper[4636]: I1003 14:16:45.966063 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Oct 03 14:16:45 crc kubenswrapper[4636]: I1003 14:16:45.980859 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/781432ad-b393-4271-8a8a-39254e422cd4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:45 crc kubenswrapper[4636]: I1003 14:16:45.980916 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/781432ad-b393-4271-8a8a-39254e422cd4-secrets\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:45 crc kubenswrapper[4636]: I1003 14:16:45.980966 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/781432ad-b393-4271-8a8a-39254e422cd4-kolla-config\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:45 crc kubenswrapper[4636]: I1003 14:16:45.980996 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/781432ad-b393-4271-8a8a-39254e422cd4-config-data-default\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:45 crc kubenswrapper[4636]: I1003 14:16:45.981036 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:45 crc kubenswrapper[4636]: I1003 14:16:45.981080 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/781432ad-b393-4271-8a8a-39254e422cd4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:45 crc kubenswrapper[4636]: I1003 14:16:45.981140 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/781432ad-b393-4271-8a8a-39254e422cd4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:45 crc kubenswrapper[4636]: I1003 14:16:45.981167 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2phxn\" (UniqueName: \"kubernetes.io/projected/781432ad-b393-4271-8a8a-39254e422cd4-kube-api-access-2phxn\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:45 crc kubenswrapper[4636]: I1003 14:16:45.981300 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/781432ad-b393-4271-8a8a-39254e422cd4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.083004 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/781432ad-b393-4271-8a8a-39254e422cd4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.083075 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/781432ad-b393-4271-8a8a-39254e422cd4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.083125 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2phxn\" (UniqueName: \"kubernetes.io/projected/781432ad-b393-4271-8a8a-39254e422cd4-kube-api-access-2phxn\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.083150 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/781432ad-b393-4271-8a8a-39254e422cd4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.083185 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/781432ad-b393-4271-8a8a-39254e422cd4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.083209 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/781432ad-b393-4271-8a8a-39254e422cd4-secrets\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.083232 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/781432ad-b393-4271-8a8a-39254e422cd4-kolla-config\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.083254 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/781432ad-b393-4271-8a8a-39254e422cd4-config-data-default\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.083276 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.083705 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/781432ad-b393-4271-8a8a-39254e422cd4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.084477 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/781432ad-b393-4271-8a8a-39254e422cd4-kolla-config\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.084761 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/781432ad-b393-4271-8a8a-39254e422cd4-config-data-default\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.084924 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/781432ad-b393-4271-8a8a-39254e422cd4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.085332 4636 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.093458 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/781432ad-b393-4271-8a8a-39254e422cd4-secrets\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.099757 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/781432ad-b393-4271-8a8a-39254e422cd4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.110291 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.112113 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/781432ad-b393-4271-8a8a-39254e422cd4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.112759 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2phxn\" (UniqueName: \"kubernetes.io/projected/781432ad-b393-4271-8a8a-39254e422cd4-kube-api-access-2phxn\") pod \"openstack-galera-0\" (UID: \"781432ad-b393-4271-8a8a-39254e422cd4\") " pod="openstack/openstack-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.274614 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.340936 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.342228 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.348749 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.348949 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.349135 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.349358 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-nk676"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.391978 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.492119 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3439f9c-0086-413d-a84f-79e7da2ffcbd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.492158 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b3439f9c-0086-413d-a84f-79e7da2ffcbd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.492186 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpsct\" (UniqueName: \"kubernetes.io/projected/b3439f9c-0086-413d-a84f-79e7da2ffcbd-kube-api-access-zpsct\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.492222 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.492239 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b3439f9c-0086-413d-a84f-79e7da2ffcbd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.492258 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3439f9c-0086-413d-a84f-79e7da2ffcbd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.492282 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b3439f9c-0086-413d-a84f-79e7da2ffcbd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.492300 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3439f9c-0086-413d-a84f-79e7da2ffcbd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.492324 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/b3439f9c-0086-413d-a84f-79e7da2ffcbd-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.593844 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b3439f9c-0086-413d-a84f-79e7da2ffcbd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.593879 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3439f9c-0086-413d-a84f-79e7da2ffcbd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.593908 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/b3439f9c-0086-413d-a84f-79e7da2ffcbd-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.594000 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3439f9c-0086-413d-a84f-79e7da2ffcbd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.594022 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b3439f9c-0086-413d-a84f-79e7da2ffcbd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.594042 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpsct\" (UniqueName: \"kubernetes.io/projected/b3439f9c-0086-413d-a84f-79e7da2ffcbd-kube-api-access-zpsct\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.594076 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.594090 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b3439f9c-0086-413d-a84f-79e7da2ffcbd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.594124 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3439f9c-0086-413d-a84f-79e7da2ffcbd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.595334 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3439f9c-0086-413d-a84f-79e7da2ffcbd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.595566 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b3439f9c-0086-413d-a84f-79e7da2ffcbd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.597279 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b3439f9c-0086-413d-a84f-79e7da2ffcbd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.601819 4636 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.603412 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b3439f9c-0086-413d-a84f-79e7da2ffcbd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.610517 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/b3439f9c-0086-413d-a84f-79e7da2ffcbd-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.610941 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3439f9c-0086-413d-a84f-79e7da2ffcbd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.612547 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3439f9c-0086-413d-a84f-79e7da2ffcbd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.626563 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpsct\" (UniqueName: \"kubernetes.io/projected/b3439f9c-0086-413d-a84f-79e7da2ffcbd-kube-api-access-zpsct\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.630619 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b3439f9c-0086-413d-a84f-79e7da2ffcbd\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.748041 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.881648 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.894895 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.897362 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.897631 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.907283 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-xlkng"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.907496 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Oct 03 14:16:46 crc kubenswrapper[4636]: I1003 14:16:46.908036 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Oct 03 14:16:47 crc kubenswrapper[4636]: I1003 14:16:47.025112 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e09844-cd33-42a1-a0dc-e1995b872663-memcached-tls-certs\") pod \"memcached-0\" (UID: \"17e09844-cd33-42a1-a0dc-e1995b872663\") " pod="openstack/memcached-0"
Oct 03 14:16:47 crc kubenswrapper[4636]: I1003 14:16:47.025422 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17e09844-cd33-42a1-a0dc-e1995b872663-config-data\") pod \"memcached-0\" (UID: \"17e09844-cd33-42a1-a0dc-e1995b872663\") " pod="openstack/memcached-0"
Oct 03 14:16:47 crc kubenswrapper[4636]: I1003 14:16:47.025790 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkpgs\" (UniqueName: \"kubernetes.io/projected/17e09844-cd33-42a1-a0dc-e1995b872663-kube-api-access-kkpgs\") pod \"memcached-0\" (UID: \"17e09844-cd33-42a1-a0dc-e1995b872663\") " pod="openstack/memcached-0"
Oct 03 14:16:47 crc kubenswrapper[4636]: I1003 14:16:47.025815 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/17e09844-cd33-42a1-a0dc-e1995b872663-kolla-config\") pod \"memcached-0\" (UID: \"17e09844-cd33-42a1-a0dc-e1995b872663\") " pod="openstack/memcached-0"
Oct 03 14:16:47 crc kubenswrapper[4636]: I1003 14:16:47.026036 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e09844-cd33-42a1-a0dc-e1995b872663-combined-ca-bundle\") pod \"memcached-0\" (UID: \"17e09844-cd33-42a1-a0dc-e1995b872663\") " pod="openstack/memcached-0"
Oct 03 14:16:47 crc kubenswrapper[4636]: I1003 14:16:47.130634 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e09844-cd33-42a1-a0dc-e1995b872663-combined-ca-bundle\") pod \"memcached-0\" (UID: \"17e09844-cd33-42a1-a0dc-e1995b872663\") " pod="openstack/memcached-0"
Oct 03 14:16:47 crc kubenswrapper[4636]: I1003 14:16:47.130708 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e09844-cd33-42a1-a0dc-e1995b872663-memcached-tls-certs\") pod \"memcached-0\" (UID: \"17e09844-cd33-42a1-a0dc-e1995b872663\") " pod="openstack/memcached-0"
Oct 03 14:16:47 crc kubenswrapper[4636]: I1003 14:16:47.130775 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17e09844-cd33-42a1-a0dc-e1995b872663-config-data\") pod \"memcached-0\" (UID: \"17e09844-cd33-42a1-a0dc-e1995b872663\") " pod="openstack/memcached-0"
Oct 03 14:16:47 crc kubenswrapper[4636]: I1003 14:16:47.130877 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkpgs\" (UniqueName: \"kubernetes.io/projected/17e09844-cd33-42a1-a0dc-e1995b872663-kube-api-access-kkpgs\") pod \"memcached-0\" (UID: \"17e09844-cd33-42a1-a0dc-e1995b872663\") " pod="openstack/memcached-0"
Oct 03 14:16:47 crc kubenswrapper[4636]: I1003 14:16:47.130916 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/17e09844-cd33-42a1-a0dc-e1995b872663-kolla-config\") pod \"memcached-0\" (UID: \"17e09844-cd33-42a1-a0dc-e1995b872663\") " pod="openstack/memcached-0"
Oct 03 14:16:47 crc kubenswrapper[4636]: I1003 14:16:47.133001 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/17e09844-cd33-42a1-a0dc-e1995b872663-kolla-config\") pod \"memcached-0\" (UID: \"17e09844-cd33-42a1-a0dc-e1995b872663\") " pod="openstack/memcached-0"
Oct 03 14:16:47 crc kubenswrapper[4636]: I1003 14:16:47.133337 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17e09844-cd33-42a1-a0dc-e1995b872663-config-data\") pod \"memcached-0\" (UID: \"17e09844-cd33-42a1-a0dc-e1995b872663\") " pod="openstack/memcached-0"
Oct 03 14:16:47 crc kubenswrapper[4636]: I1003 14:16:47.145906 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e09844-cd33-42a1-a0dc-e1995b872663-memcached-tls-certs\") pod \"memcached-0\" (UID: \"17e09844-cd33-42a1-a0dc-e1995b872663\") " pod="openstack/memcached-0"
Oct 03 14:16:47 crc kubenswrapper[4636]: I1003 14:16:47.156069 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e09844-cd33-42a1-a0dc-e1995b872663-combined-ca-bundle\") pod \"memcached-0\" (UID: \"17e09844-cd33-42a1-a0dc-e1995b872663\") " pod="openstack/memcached-0"
Oct 03 14:16:47 crc kubenswrapper[4636]: I1003 14:16:47.161375 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkpgs\" (UniqueName: \"kubernetes.io/projected/17e09844-cd33-42a1-a0dc-e1995b872663-kube-api-access-kkpgs\") pod \"memcached-0\" (UID: \"17e09844-cd33-42a1-a0dc-e1995b872663\") " pod="openstack/memcached-0"
Oct 03 14:16:47 crc kubenswrapper[4636]: I1003 14:16:47.244063 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Oct 03 14:16:47 crc kubenswrapper[4636]: I1003 14:16:47.534960 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 03 14:16:47 crc kubenswrapper[4636]: I1003 14:16:47.634772 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"781432ad-b393-4271-8a8a-39254e422cd4","Type":"ContainerStarted","Data":"14580a026f80f60036bf58ac205e506a42a877246273bc34e9bbe9bbe6b6b92b"}
Oct 03 14:16:47 crc kubenswrapper[4636]: I1003 14:16:47.970121 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Oct 03 14:16:48 crc kubenswrapper[4636]: I1003 14:16:48.676219 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 03 14:16:48 crc kubenswrapper[4636]: I1003 14:16:48.677272 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 14:16:48 crc kubenswrapper[4636]: I1003 14:16:48.682255 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-dnbds" Oct 03 14:16:48 crc kubenswrapper[4636]: I1003 14:16:48.685935 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 14:16:48 crc kubenswrapper[4636]: I1003 14:16:48.856561 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7z9f\" (UniqueName: \"kubernetes.io/projected/25cea5cd-0d10-4569-952f-a884d6478382-kube-api-access-g7z9f\") pod \"kube-state-metrics-0\" (UID: \"25cea5cd-0d10-4569-952f-a884d6478382\") " pod="openstack/kube-state-metrics-0" Oct 03 14:16:48 crc kubenswrapper[4636]: I1003 14:16:48.958083 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7z9f\" (UniqueName: \"kubernetes.io/projected/25cea5cd-0d10-4569-952f-a884d6478382-kube-api-access-g7z9f\") pod \"kube-state-metrics-0\" (UID: \"25cea5cd-0d10-4569-952f-a884d6478382\") " pod="openstack/kube-state-metrics-0" Oct 03 14:16:48 crc kubenswrapper[4636]: I1003 14:16:48.977624 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7z9f\" (UniqueName: \"kubernetes.io/projected/25cea5cd-0d10-4569-952f-a884d6478382-kube-api-access-g7z9f\") pod \"kube-state-metrics-0\" (UID: \"25cea5cd-0d10-4569-952f-a884d6478382\") " pod="openstack/kube-state-metrics-0" Oct 03 14:16:49 crc kubenswrapper[4636]: I1003 14:16:49.018477 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.201746 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-2mfj2"] Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.203148 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.208128 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.209846 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-cg7g9" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.210179 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.214521 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2mfj2"] Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.230872 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-2pfz4"] Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.232418 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.274394 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2pfz4"] Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.318805 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fc054158-e506-4945-b3da-50265dc1b1aa-var-run\") pod \"ovn-controller-ovs-2pfz4\" (UID: \"fc054158-e506-4945-b3da-50265dc1b1aa\") " pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.318872 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62646db9-d39c-4cb1-b308-22dff51e4bcf-scripts\") pod \"ovn-controller-2mfj2\" (UID: \"62646db9-d39c-4cb1-b308-22dff51e4bcf\") " pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.318905 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/fc054158-e506-4945-b3da-50265dc1b1aa-var-log\") pod \"ovn-controller-ovs-2pfz4\" (UID: \"fc054158-e506-4945-b3da-50265dc1b1aa\") " pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.318939 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/fc054158-e506-4945-b3da-50265dc1b1aa-var-lib\") pod \"ovn-controller-ovs-2pfz4\" (UID: \"fc054158-e506-4945-b3da-50265dc1b1aa\") " pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.318988 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/62646db9-d39c-4cb1-b308-22dff51e4bcf-var-run\") pod \"ovn-controller-2mfj2\" (UID: \"62646db9-d39c-4cb1-b308-22dff51e4bcf\") " pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.319011 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/fc054158-e506-4945-b3da-50265dc1b1aa-etc-ovs\") pod \"ovn-controller-ovs-2pfz4\" (UID: \"fc054158-e506-4945-b3da-50265dc1b1aa\") " pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.319238 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/62646db9-d39c-4cb1-b308-22dff51e4bcf-var-log-ovn\") pod \"ovn-controller-2mfj2\" (UID: \"62646db9-d39c-4cb1-b308-22dff51e4bcf\") " pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.319255 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc054158-e506-4945-b3da-50265dc1b1aa-scripts\") pod \"ovn-controller-ovs-2pfz4\" (UID: \"fc054158-e506-4945-b3da-50265dc1b1aa\") " pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.319281 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ptsz\" (UniqueName: 
\"kubernetes.io/projected/62646db9-d39c-4cb1-b308-22dff51e4bcf-kube-api-access-4ptsz\") pod \"ovn-controller-2mfj2\" (UID: \"62646db9-d39c-4cb1-b308-22dff51e4bcf\") " pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.319309 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62646db9-d39c-4cb1-b308-22dff51e4bcf-combined-ca-bundle\") pod \"ovn-controller-2mfj2\" (UID: \"62646db9-d39c-4cb1-b308-22dff51e4bcf\") " pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.319353 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/62646db9-d39c-4cb1-b308-22dff51e4bcf-var-run-ovn\") pod \"ovn-controller-2mfj2\" (UID: \"62646db9-d39c-4cb1-b308-22dff51e4bcf\") " pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.319371 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/62646db9-d39c-4cb1-b308-22dff51e4bcf-ovn-controller-tls-certs\") pod \"ovn-controller-2mfj2\" (UID: \"62646db9-d39c-4cb1-b308-22dff51e4bcf\") " pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.319390 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5fbj\" (UniqueName: \"kubernetes.io/projected/fc054158-e506-4945-b3da-50265dc1b1aa-kube-api-access-r5fbj\") pod \"ovn-controller-ovs-2pfz4\" (UID: \"fc054158-e506-4945-b3da-50265dc1b1aa\") " pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.420605 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/62646db9-d39c-4cb1-b308-22dff51e4bcf-var-log-ovn\") pod \"ovn-controller-2mfj2\" (UID: \"62646db9-d39c-4cb1-b308-22dff51e4bcf\") " pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.420657 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc054158-e506-4945-b3da-50265dc1b1aa-scripts\") pod \"ovn-controller-ovs-2pfz4\" (UID: \"fc054158-e506-4945-b3da-50265dc1b1aa\") " pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.420692 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ptsz\" (UniqueName: \"kubernetes.io/projected/62646db9-d39c-4cb1-b308-22dff51e4bcf-kube-api-access-4ptsz\") pod \"ovn-controller-2mfj2\" (UID: \"62646db9-d39c-4cb1-b308-22dff51e4bcf\") " pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.420730 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62646db9-d39c-4cb1-b308-22dff51e4bcf-combined-ca-bundle\") pod \"ovn-controller-2mfj2\" (UID: \"62646db9-d39c-4cb1-b308-22dff51e4bcf\") " pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.420756 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/62646db9-d39c-4cb1-b308-22dff51e4bcf-var-run-ovn\") pod \"ovn-controller-2mfj2\" (UID: \"62646db9-d39c-4cb1-b308-22dff51e4bcf\") " pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.420776 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/62646db9-d39c-4cb1-b308-22dff51e4bcf-ovn-controller-tls-certs\") pod \"ovn-controller-2mfj2\" (UID: \"62646db9-d39c-4cb1-b308-22dff51e4bcf\") " pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.420800 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5fbj\" (UniqueName: \"kubernetes.io/projected/fc054158-e506-4945-b3da-50265dc1b1aa-kube-api-access-r5fbj\") pod \"ovn-controller-ovs-2pfz4\" (UID: \"fc054158-e506-4945-b3da-50265dc1b1aa\") " pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.420846 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fc054158-e506-4945-b3da-50265dc1b1aa-var-run\") pod \"ovn-controller-ovs-2pfz4\" (UID: \"fc054158-e506-4945-b3da-50265dc1b1aa\") " pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.420880 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62646db9-d39c-4cb1-b308-22dff51e4bcf-scripts\") pod \"ovn-controller-2mfj2\" (UID: \"62646db9-d39c-4cb1-b308-22dff51e4bcf\") " pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.420909 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/fc054158-e506-4945-b3da-50265dc1b1aa-var-log\") pod \"ovn-controller-ovs-2pfz4\" (UID: \"fc054158-e506-4945-b3da-50265dc1b1aa\") " pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.420950 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/fc054158-e506-4945-b3da-50265dc1b1aa-var-lib\") pod \"ovn-controller-ovs-2pfz4\" (UID: \"fc054158-e506-4945-b3da-50265dc1b1aa\") " pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.421009 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/62646db9-d39c-4cb1-b308-22dff51e4bcf-var-run\") pod \"ovn-controller-2mfj2\" (UID: \"62646db9-d39c-4cb1-b308-22dff51e4bcf\") " pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.421032 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/fc054158-e506-4945-b3da-50265dc1b1aa-etc-ovs\") pod \"ovn-controller-ovs-2pfz4\" (UID: \"fc054158-e506-4945-b3da-50265dc1b1aa\") " pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.423989 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62646db9-d39c-4cb1-b308-22dff51e4bcf-scripts\") pod \"ovn-controller-2mfj2\" (UID: \"62646db9-d39c-4cb1-b308-22dff51e4bcf\") " pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc 
kubenswrapper[4636]: I1003 14:16:52.424410 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/fc054158-e506-4945-b3da-50265dc1b1aa-etc-ovs\") pod \"ovn-controller-ovs-2pfz4\" (UID: \"fc054158-e506-4945-b3da-50265dc1b1aa\") " pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.424746 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc054158-e506-4945-b3da-50265dc1b1aa-scripts\") pod \"ovn-controller-ovs-2pfz4\" (UID: \"fc054158-e506-4945-b3da-50265dc1b1aa\") " pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.425128 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/62646db9-d39c-4cb1-b308-22dff51e4bcf-var-log-ovn\") pod \"ovn-controller-2mfj2\" (UID: \"62646db9-d39c-4cb1-b308-22dff51e4bcf\") " pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.426427 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fc054158-e506-4945-b3da-50265dc1b1aa-var-run\") pod \"ovn-controller-ovs-2pfz4\" (UID: \"fc054158-e506-4945-b3da-50265dc1b1aa\") " pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.427181 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/62646db9-d39c-4cb1-b308-22dff51e4bcf-ovn-controller-tls-certs\") pod \"ovn-controller-2mfj2\" (UID: \"62646db9-d39c-4cb1-b308-22dff51e4bcf\") " pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.437343 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/62646db9-d39c-4cb1-b308-22dff51e4bcf-var-run-ovn\") pod \"ovn-controller-2mfj2\" (UID: \"62646db9-d39c-4cb1-b308-22dff51e4bcf\") " pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.437352 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/fc054158-e506-4945-b3da-50265dc1b1aa-var-log\") pod \"ovn-controller-ovs-2pfz4\" (UID: \"fc054158-e506-4945-b3da-50265dc1b1aa\") " pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.437477 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/fc054158-e506-4945-b3da-50265dc1b1aa-var-lib\") pod \"ovn-controller-ovs-2pfz4\" (UID: \"fc054158-e506-4945-b3da-50265dc1b1aa\") " pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.437482 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/62646db9-d39c-4cb1-b308-22dff51e4bcf-var-run\") pod \"ovn-controller-2mfj2\" (UID: \"62646db9-d39c-4cb1-b308-22dff51e4bcf\") " pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.437830 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62646db9-d39c-4cb1-b308-22dff51e4bcf-combined-ca-bundle\") pod \"ovn-controller-2mfj2\" (UID: 
\"62646db9-d39c-4cb1-b308-22dff51e4bcf\") " pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.438412 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5fbj\" (UniqueName: \"kubernetes.io/projected/fc054158-e506-4945-b3da-50265dc1b1aa-kube-api-access-r5fbj\") pod \"ovn-controller-ovs-2pfz4\" (UID: \"fc054158-e506-4945-b3da-50265dc1b1aa\") " pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.446649 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ptsz\" (UniqueName: \"kubernetes.io/projected/62646db9-d39c-4cb1-b308-22dff51e4bcf-kube-api-access-4ptsz\") pod \"ovn-controller-2mfj2\" (UID: \"62646db9-d39c-4cb1-b308-22dff51e4bcf\") " pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.521467 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2mfj2" Oct 03 14:16:52 crc kubenswrapper[4636]: I1003 14:16:52.549436 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.096897 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.098417 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.107084 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.107266 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.107405 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.107884 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.113239 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-6tqkz" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.119327 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.233802 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af99ddda-1ae6-4b70-9422-06c99e8664e5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.233862 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af99ddda-1ae6-4b70-9422-06c99e8664e5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.233949 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cprvg\" (UniqueName: 
\"kubernetes.io/projected/af99ddda-1ae6-4b70-9422-06c99e8664e5-kube-api-access-cprvg\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.234032 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af99ddda-1ae6-4b70-9422-06c99e8664e5-config\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.234061 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/af99ddda-1ae6-4b70-9422-06c99e8664e5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.234126 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/af99ddda-1ae6-4b70-9422-06c99e8664e5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.234176 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.234193 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af99ddda-1ae6-4b70-9422-06c99e8664e5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.335222 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cprvg\" (UniqueName: \"kubernetes.io/projected/af99ddda-1ae6-4b70-9422-06c99e8664e5-kube-api-access-cprvg\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.335297 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af99ddda-1ae6-4b70-9422-06c99e8664e5-config\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.335320 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/af99ddda-1ae6-4b70-9422-06c99e8664e5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.335350 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/af99ddda-1ae6-4b70-9422-06c99e8664e5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " 
pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.335392 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af99ddda-1ae6-4b70-9422-06c99e8664e5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.335407 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.335431 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af99ddda-1ae6-4b70-9422-06c99e8664e5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.335448 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af99ddda-1ae6-4b70-9422-06c99e8664e5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.336159 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af99ddda-1ae6-4b70-9422-06c99e8664e5-config\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.336520 4636 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.337071 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af99ddda-1ae6-4b70-9422-06c99e8664e5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.337313 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/af99ddda-1ae6-4b70-9422-06c99e8664e5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.340280 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af99ddda-1ae6-4b70-9422-06c99e8664e5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.343497 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af99ddda-1ae6-4b70-9422-06c99e8664e5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.357058 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cprvg\" (UniqueName: \"kubernetes.io/projected/af99ddda-1ae6-4b70-9422-06c99e8664e5-kube-api-access-cprvg\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.360174 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/af99ddda-1ae6-4b70-9422-06c99e8664e5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.361073 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"af99ddda-1ae6-4b70-9422-06c99e8664e5\") " pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:53 crc kubenswrapper[4636]: I1003 14:16:53.419852 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 14:16:55 crc kubenswrapper[4636]: I1003 14:16:55.703651 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"17e09844-cd33-42a1-a0dc-e1995b872663","Type":"ContainerStarted","Data":"ac203476a32fbf44ab3aa3388383fb4d51dc1d3f591de2f293266f6a1dda01a5"} Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.009197 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.010804 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.013074 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.013470 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-tnc5d" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.013853 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.013890 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.030932 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.181114 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a4510e7-aa39-4e1f-80bb-196127d2643c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.181211 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a4510e7-aa39-4e1f-80bb-196127d2643c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.181263 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a4510e7-aa39-4e1f-80bb-196127d2643c-config\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.181279 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a4510e7-aa39-4e1f-80bb-196127d2643c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.181302 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.181450 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2a4510e7-aa39-4e1f-80bb-196127d2643c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.181498 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpkw7\" (UniqueName: \"kubernetes.io/projected/2a4510e7-aa39-4e1f-80bb-196127d2643c-kube-api-access-mpkw7\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " 
pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.181631 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a4510e7-aa39-4e1f-80bb-196127d2643c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.286111 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a4510e7-aa39-4e1f-80bb-196127d2643c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.286196 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a4510e7-aa39-4e1f-80bb-196127d2643c-config\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.286224 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a4510e7-aa39-4e1f-80bb-196127d2643c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.286261 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.286284 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2a4510e7-aa39-4e1f-80bb-196127d2643c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.286311 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpkw7\" (UniqueName: \"kubernetes.io/projected/2a4510e7-aa39-4e1f-80bb-196127d2643c-kube-api-access-mpkw7\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.286356 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a4510e7-aa39-4e1f-80bb-196127d2643c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.286394 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a4510e7-aa39-4e1f-80bb-196127d2643c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.293004 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/2a4510e7-aa39-4e1f-80bb-196127d2643c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.293791 4636 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.294005 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2a4510e7-aa39-4e1f-80bb-196127d2643c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.296791 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a4510e7-aa39-4e1f-80bb-196127d2643c-config\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.308896 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a4510e7-aa39-4e1f-80bb-196127d2643c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.310247 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a4510e7-aa39-4e1f-80bb-196127d2643c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.317305 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a4510e7-aa39-4e1f-80bb-196127d2643c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.318611 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpkw7\" (UniqueName: \"kubernetes.io/projected/2a4510e7-aa39-4e1f-80bb-196127d2643c-kube-api-access-mpkw7\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.323829 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2a4510e7-aa39-4e1f-80bb-196127d2643c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.340991 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 14:16:56 crc kubenswrapper[4636]: I1003 14:16:56.711221 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b3439f9c-0086-413d-a84f-79e7da2ffcbd","Type":"ContainerStarted","Data":"88c2fa3dc5747e04e8bfdfacdfe8f1fd901b59199655715eb141778fa8e54e81"} Oct 03 14:17:00 crc kubenswrapper[4636]: I1003 14:17:00.672766 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 14:17:05 crc kubenswrapper[4636]: E1003 14:17:05.467607 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Oct 03 14:17:05 crc kubenswrapper[4636]: E1003 14:17:05.468740 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:DB_ROOT_PASSWORD,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:DbRootPassword,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:secrets,ReadOnly:true,MountPath:/var/lib/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2phxn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(781432ad-b393-4271-8a8a-39254e422cd4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:17:05 crc 
kubenswrapper[4636]: E1003 14:17:05.479785 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="781432ad-b393-4271-8a8a-39254e422cd4" Oct 03 14:17:05 crc kubenswrapper[4636]: I1003 14:17:05.776943 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"af99ddda-1ae6-4b70-9422-06c99e8664e5","Type":"ContainerStarted","Data":"32e3966fdea5eb25b86e2d0c95b0f2bfe816eeb6e12ebc81dd3cfc4490460463"} Oct 03 14:17:05 crc kubenswrapper[4636]: E1003 14:17:05.778543 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="781432ad-b393-4271-8a8a-39254e422cd4" Oct 03 14:17:06 crc kubenswrapper[4636]: I1003 14:17:06.082996 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2pfz4"] Oct 03 14:17:06 crc kubenswrapper[4636]: E1003 14:17:06.431089 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 03 14:17:06 crc kubenswrapper[4636]: E1003 14:17:06.431272 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8mc6t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
dnsmasq-dns-675f4bcbfc-f89fc_openstack(fa94d03e-bd55-47a5-8813-162199936a3d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:17:06 crc kubenswrapper[4636]: E1003 14:17:06.433033 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-f89fc" podUID="fa94d03e-bd55-47a5-8813-162199936a3d" Oct 03 14:17:06 crc kubenswrapper[4636]: E1003 14:17:06.455223 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 03 14:17:06 crc kubenswrapper[4636]: E1003 14:17:06.455374 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2csvz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-fvqnr_openstack(85ebe8f6-b599-49cc-8477-6eeedf047ac4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:17:06 crc kubenswrapper[4636]: E1003 14:17:06.456543 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-fvqnr" 
podUID="85ebe8f6-b599-49cc-8477-6eeedf047ac4" Oct 03 14:17:06 crc kubenswrapper[4636]: E1003 14:17:06.459934 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 03 14:17:06 crc kubenswrapper[4636]: E1003 14:17:06.460085 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8f6tk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-rxpgk_openstack(948352a1-041e-447b-b36a-97438f39d0e8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:17:06 crc kubenswrapper[4636]: E1003 14:17:06.461416 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-rxpgk" podUID="948352a1-041e-447b-b36a-97438f39d0e8" Oct 03 14:17:06 crc kubenswrapper[4636]: E1003 14:17:06.788992 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-rxpgk" podUID="948352a1-041e-447b-b36a-97438f39d0e8" Oct 03 14:17:07 crc kubenswrapper[4636]: W1003 14:17:07.170994 4636 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc054158_e506_4945_b3da_50265dc1b1aa.slice/crio-81820aa7ee61810f3040c980f6d48764e82f6c6bc7da1f48606761fb77994416 WatchSource:0}: Error finding container 81820aa7ee61810f3040c980f6d48764e82f6c6bc7da1f48606761fb77994416: Status 404 returned error can't find the container with id 81820aa7ee61810f3040c980f6d48764e82f6c6bc7da1f48606761fb77994416 Oct 03 14:17:07 crc kubenswrapper[4636]: E1003 14:17:07.219127 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 03 14:17:07 crc kubenswrapper[4636]: E1003 14:17:07.219547 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gxgs7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-5dc2p_openstack(c6ae5571-6f09-4f64-b071-d669dc4d3f1f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:17:07 crc kubenswrapper[4636]: E1003 14:17:07.220808 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-5dc2p" podUID="c6ae5571-6f09-4f64-b071-d669dc4d3f1f" Oct 03 14:17:07 crc kubenswrapper[4636]: 
I1003 14:17:07.343136 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-f89fc" Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.366424 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fvqnr" Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.480752 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa94d03e-bd55-47a5-8813-162199936a3d-config\") pod \"fa94d03e-bd55-47a5-8813-162199936a3d\" (UID: \"fa94d03e-bd55-47a5-8813-162199936a3d\") " Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.481387 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2csvz\" (UniqueName: \"kubernetes.io/projected/85ebe8f6-b599-49cc-8477-6eeedf047ac4-kube-api-access-2csvz\") pod \"85ebe8f6-b599-49cc-8477-6eeedf047ac4\" (UID: \"85ebe8f6-b599-49cc-8477-6eeedf047ac4\") " Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.481425 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85ebe8f6-b599-49cc-8477-6eeedf047ac4-dns-svc\") pod \"85ebe8f6-b599-49cc-8477-6eeedf047ac4\" (UID: \"85ebe8f6-b599-49cc-8477-6eeedf047ac4\") " Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.481456 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa94d03e-bd55-47a5-8813-162199936a3d-config" (OuterVolumeSpecName: "config") pod "fa94d03e-bd55-47a5-8813-162199936a3d" (UID: "fa94d03e-bd55-47a5-8813-162199936a3d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.481538 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ebe8f6-b599-49cc-8477-6eeedf047ac4-config\") pod \"85ebe8f6-b599-49cc-8477-6eeedf047ac4\" (UID: \"85ebe8f6-b599-49cc-8477-6eeedf047ac4\") " Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.481619 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mc6t\" (UniqueName: \"kubernetes.io/projected/fa94d03e-bd55-47a5-8813-162199936a3d-kube-api-access-8mc6t\") pod \"fa94d03e-bd55-47a5-8813-162199936a3d\" (UID: \"fa94d03e-bd55-47a5-8813-162199936a3d\") " Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.482453 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85ebe8f6-b599-49cc-8477-6eeedf047ac4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "85ebe8f6-b599-49cc-8477-6eeedf047ac4" (UID: "85ebe8f6-b599-49cc-8477-6eeedf047ac4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.482923 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85ebe8f6-b599-49cc-8477-6eeedf047ac4-config" (OuterVolumeSpecName: "config") pod "85ebe8f6-b599-49cc-8477-6eeedf047ac4" (UID: "85ebe8f6-b599-49cc-8477-6eeedf047ac4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.483174 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa94d03e-bd55-47a5-8813-162199936a3d-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.483207 4636 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85ebe8f6-b599-49cc-8477-6eeedf047ac4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.483220 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ebe8f6-b599-49cc-8477-6eeedf047ac4-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.485247 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85ebe8f6-b599-49cc-8477-6eeedf047ac4-kube-api-access-2csvz" (OuterVolumeSpecName: "kube-api-access-2csvz") pod "85ebe8f6-b599-49cc-8477-6eeedf047ac4" (UID: "85ebe8f6-b599-49cc-8477-6eeedf047ac4"). InnerVolumeSpecName "kube-api-access-2csvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.490267 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa94d03e-bd55-47a5-8813-162199936a3d-kube-api-access-8mc6t" (OuterVolumeSpecName: "kube-api-access-8mc6t") pod "fa94d03e-bd55-47a5-8813-162199936a3d" (UID: "fa94d03e-bd55-47a5-8813-162199936a3d"). InnerVolumeSpecName "kube-api-access-8mc6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.585807 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mc6t\" (UniqueName: \"kubernetes.io/projected/fa94d03e-bd55-47a5-8813-162199936a3d-kube-api-access-8mc6t\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.585841 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2csvz\" (UniqueName: \"kubernetes.io/projected/85ebe8f6-b599-49cc-8477-6eeedf047ac4-kube-api-access-2csvz\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.683484 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.803962 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2mfj2"] Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.807921 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b3439f9c-0086-413d-a84f-79e7da2ffcbd","Type":"ContainerStarted","Data":"b12eeed375dac8b89fc6b823a790b37f9fdc49f58f9f89ebc920b367279f595c"} Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.820623 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-f89fc" event={"ID":"fa94d03e-bd55-47a5-8813-162199936a3d","Type":"ContainerDied","Data":"d144a23637d78cda7a51e49ecddb5f101ba35bf2ed8b29ea87f164673d55891d"} Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.820786 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-f89fc" Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.822948 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"17e09844-cd33-42a1-a0dc-e1995b872663","Type":"ContainerStarted","Data":"23e829af0333df4a96c7e160ec462908c4b7563d16dfd5130fba399d0d0a3e17"} Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.826513 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.829804 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2pfz4" event={"ID":"fc054158-e506-4945-b3da-50265dc1b1aa","Type":"ContainerStarted","Data":"81820aa7ee61810f3040c980f6d48764e82f6c6bc7da1f48606761fb77994416"} Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.844933 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-fvqnr" event={"ID":"85ebe8f6-b599-49cc-8477-6eeedf047ac4","Type":"ContainerDied","Data":"b7d9a575d99f17045e58e46fc7ced4689bb813568ea3be87ec02e46ba4108ca8"} Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.844957 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fvqnr" Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.850520 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"25cea5cd-0d10-4569-952f-a884d6478382","Type":"ContainerStarted","Data":"80dc665eeadb9b61aa789e370d9001a9d44833811ed6dfcdf24cb403d9c1b380"} Oct 03 14:17:07 crc kubenswrapper[4636]: E1003 14:17:07.853970 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-5dc2p" podUID="c6ae5571-6f09-4f64-b071-d669dc4d3f1f" Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.889160 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=9.712024365 podStartE2EDuration="21.889138634s" podCreationTimestamp="2025-10-03 14:16:46 +0000 UTC" firstStartedPulling="2025-10-03 14:16:55.099188897 +0000 UTC m=+964.957915144" lastFinishedPulling="2025-10-03 14:17:07.276303166 +0000 UTC m=+977.135029413" observedRunningTime="2025-10-03 14:17:07.867826775 +0000 UTC m=+977.726553032" watchObservedRunningTime="2025-10-03 14:17:07.889138634 +0000 UTC m=+977.747864881" Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.926411 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.966162 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fvqnr"] Oct 03 14:17:07 crc kubenswrapper[4636]: I1003 14:17:07.976330 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fvqnr"] Oct 03 14:17:08 crc kubenswrapper[4636]: I1003 14:17:08.006598 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-f89fc"] Oct 03 14:17:08 crc kubenswrapper[4636]: I1003 14:17:08.011300 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-f89fc"] Oct 03 14:17:08 crc kubenswrapper[4636]: I1003 14:17:08.802844 4636 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="85ebe8f6-b599-49cc-8477-6eeedf047ac4" path="/var/lib/kubelet/pods/85ebe8f6-b599-49cc-8477-6eeedf047ac4/volumes" Oct 03 14:17:08 crc kubenswrapper[4636]: I1003 14:17:08.803261 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa94d03e-bd55-47a5-8813-162199936a3d" path="/var/lib/kubelet/pods/fa94d03e-bd55-47a5-8813-162199936a3d/volumes" Oct 03 14:17:08 crc kubenswrapper[4636]: I1003 14:17:08.858896 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mfj2" event={"ID":"62646db9-d39c-4cb1-b308-22dff51e4bcf","Type":"ContainerStarted","Data":"6af63cf2ea6bee96b6fa5bd41fcd42886a57dc27c19caa1243eb3b2066cc1cc5"} Oct 03 14:17:08 crc kubenswrapper[4636]: I1003 14:17:08.860111 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5f862438-7485-4e2c-a5b5-a6f3acf809ab","Type":"ContainerStarted","Data":"01ffb247be0a30764551cb82b336ce02fedb37cd673ba1df60771c5e5fa407af"} Oct 03 14:17:08 crc kubenswrapper[4636]: I1003 14:17:08.863770 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"61bd2d74-76de-402c-99af-f18ddf19610c","Type":"ContainerStarted","Data":"73409de03024ed94671ee96763ba393f0a99ca4a5dad0d25b9dbdad608cb9eb8"} Oct 03 14:17:08 crc kubenswrapper[4636]: I1003 14:17:08.866910 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2a4510e7-aa39-4e1f-80bb-196127d2643c","Type":"ContainerStarted","Data":"ddec4f31ab208646a8ca126fd711c3006550c4711b878bac51b4bf629cba69e8"} Oct 03 14:17:09 crc kubenswrapper[4636]: I1003 14:17:09.163481 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:17:09 crc kubenswrapper[4636]: I1003 14:17:09.163541 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:17:09 crc kubenswrapper[4636]: I1003 14:17:09.163595 4636 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 14:17:09 crc kubenswrapper[4636]: I1003 14:17:09.165112 4636 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"07c604f152aa39f3430c1f62789f7be96b3f5a7c96a65ed6157e5d00f0a88d5d"} pod="openshift-machine-config-operator/machine-config-daemon-ngmch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 14:17:09 crc kubenswrapper[4636]: I1003 14:17:09.165203 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" containerID="cri-o://07c604f152aa39f3430c1f62789f7be96b3f5a7c96a65ed6157e5d00f0a88d5d" gracePeriod=600 Oct 03 14:17:09 crc kubenswrapper[4636]: I1003 14:17:09.875637 4636 generic.go:334] "Generic (PLEG): container finished" 
podID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerID="07c604f152aa39f3430c1f62789f7be96b3f5a7c96a65ed6157e5d00f0a88d5d" exitCode=0 Oct 03 14:17:09 crc kubenswrapper[4636]: I1003 14:17:09.875726 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerDied","Data":"07c604f152aa39f3430c1f62789f7be96b3f5a7c96a65ed6157e5d00f0a88d5d"} Oct 03 14:17:09 crc kubenswrapper[4636]: I1003 14:17:09.876031 4636 scope.go:117] "RemoveContainer" containerID="8c343b2c3198b919be0641d5d289b1294e3d107e0057a5a4c2427bf1f447e7a9" Oct 03 14:17:11 crc kubenswrapper[4636]: I1003 14:17:11.896757 4636 generic.go:334] "Generic (PLEG): container finished" podID="b3439f9c-0086-413d-a84f-79e7da2ffcbd" containerID="b12eeed375dac8b89fc6b823a790b37f9fdc49f58f9f89ebc920b367279f595c" exitCode=0 Oct 03 14:17:11 crc kubenswrapper[4636]: I1003 14:17:11.896853 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b3439f9c-0086-413d-a84f-79e7da2ffcbd","Type":"ContainerDied","Data":"b12eeed375dac8b89fc6b823a790b37f9fdc49f58f9f89ebc920b367279f595c"} Oct 03 14:17:12 crc kubenswrapper[4636]: I1003 14:17:12.245284 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 03 14:17:12 crc kubenswrapper[4636]: I1003 14:17:12.908705 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerStarted","Data":"1d353a53ac9390ffae337e3feef5ea083eb94bb2a25b7898e4f341f0e42163eb"} Oct 03 14:17:13 crc kubenswrapper[4636]: I1003 14:17:13.916474 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2pfz4" event={"ID":"fc054158-e506-4945-b3da-50265dc1b1aa","Type":"ContainerStarted","Data":"48fbf1c32d1a258f636f6bb1ff9449d00ab50479c7f144bfd0c05530e7e758dd"} Oct 03 14:17:14 crc kubenswrapper[4636]: I1003 14:17:14.923447 4636 generic.go:334] "Generic (PLEG): container finished" podID="fc054158-e506-4945-b3da-50265dc1b1aa" containerID="48fbf1c32d1a258f636f6bb1ff9449d00ab50479c7f144bfd0c05530e7e758dd" exitCode=0 Oct 03 14:17:14 crc kubenswrapper[4636]: I1003 14:17:14.923500 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2pfz4" event={"ID":"fc054158-e506-4945-b3da-50265dc1b1aa","Type":"ContainerDied","Data":"48fbf1c32d1a258f636f6bb1ff9449d00ab50479c7f144bfd0c05530e7e758dd"} Oct 03 14:17:15 crc kubenswrapper[4636]: I1003 14:17:15.930935 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"af99ddda-1ae6-4b70-9422-06c99e8664e5","Type":"ContainerStarted","Data":"1238365c701be486d4c997a144a09c7f7955ad1eb9bd115a2c16f3f08f5e14fb"} Oct 03 14:17:15 crc kubenswrapper[4636]: I1003 14:17:15.935408 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b3439f9c-0086-413d-a84f-79e7da2ffcbd","Type":"ContainerStarted","Data":"a9b015fdc9e733b9e955a20e18b66aafd291faa333b048a9b9f146b8b7aaed56"} Oct 03 14:17:15 crc kubenswrapper[4636]: I1003 14:17:15.937085 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2a4510e7-aa39-4e1f-80bb-196127d2643c","Type":"ContainerStarted","Data":"27ee875b12bd07dcbbfbf794424f22df9ac3c00e1305329cb2b481b0e271f022"} Oct 03 14:17:15 crc kubenswrapper[4636]: I1003 
14:17:15.939372 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2pfz4" event={"ID":"fc054158-e506-4945-b3da-50265dc1b1aa","Type":"ContainerStarted","Data":"f76c8ba43827a3125bb59f768adc08ca62e7c03b86b03d9f0ce6e8cbcbeaccc4"} Oct 03 14:17:15 crc kubenswrapper[4636]: I1003 14:17:15.940719 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mfj2" event={"ID":"62646db9-d39c-4cb1-b308-22dff51e4bcf","Type":"ContainerStarted","Data":"0f10789f000ef5d25ccfac0474fa905d1b8477a1ae0f8cf9124edf5c75617c60"} Oct 03 14:17:16 crc kubenswrapper[4636]: I1003 14:17:16.948656 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-2mfj2" Oct 03 14:17:16 crc kubenswrapper[4636]: I1003 14:17:16.971528 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=21.033499415 podStartE2EDuration="31.971494842s" podCreationTimestamp="2025-10-03 14:16:45 +0000 UTC" firstStartedPulling="2025-10-03 14:16:56.310089362 +0000 UTC m=+966.168815609" lastFinishedPulling="2025-10-03 14:17:07.248084789 +0000 UTC m=+977.106811036" observedRunningTime="2025-10-03 14:17:16.967585761 +0000 UTC m=+986.826312008" watchObservedRunningTime="2025-10-03 14:17:16.971494842 +0000 UTC m=+986.830221089" Oct 03 14:17:17 crc kubenswrapper[4636]: I1003 14:17:17.004906 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-2mfj2" podStartSLOduration=20.828232266 podStartE2EDuration="25.004878203s" podCreationTimestamp="2025-10-03 14:16:52 +0000 UTC" firstStartedPulling="2025-10-03 14:17:07.82689819 +0000 UTC m=+977.685624437" lastFinishedPulling="2025-10-03 14:17:12.003544127 +0000 UTC m=+981.862270374" observedRunningTime="2025-10-03 14:17:16.998830447 +0000 UTC m=+986.857556694" watchObservedRunningTime="2025-10-03 14:17:17.004878203 +0000 UTC m=+986.863604450" Oct 03 14:17:17 crc kubenswrapper[4636]: I1003 14:17:17.956800 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2pfz4" event={"ID":"fc054158-e506-4945-b3da-50265dc1b1aa","Type":"ContainerStarted","Data":"0b026bb98a0137677ed4ecabd4908b26fef139cd5469115777f2e6a750e3ed4b"} Oct 03 14:17:17 crc kubenswrapper[4636]: I1003 14:17:17.957166 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:17:17 crc kubenswrapper[4636]: I1003 14:17:17.979067 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-2pfz4" podStartSLOduration=21.689273652 podStartE2EDuration="25.979046786s" podCreationTimestamp="2025-10-03 14:16:52 +0000 UTC" firstStartedPulling="2025-10-03 14:17:07.180743993 +0000 UTC m=+977.039470240" lastFinishedPulling="2025-10-03 14:17:11.470517127 +0000 UTC m=+981.329243374" observedRunningTime="2025-10-03 14:17:17.978574913 +0000 UTC m=+987.837301160" watchObservedRunningTime="2025-10-03 14:17:17.979046786 +0000 UTC m=+987.837773033" Oct 03 14:17:18 crc kubenswrapper[4636]: I1003 14:17:18.963649 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:17:19 crc kubenswrapper[4636]: I1003 14:17:19.039490 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rxpgk"] Oct 03 14:17:19 crc kubenswrapper[4636]: I1003 14:17:19.088177 4636 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-7cb5889db5-qgprz"] Oct 03 14:17:19 crc kubenswrapper[4636]: I1003 14:17:19.089644 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-qgprz" Oct 03 14:17:19 crc kubenswrapper[4636]: I1003 14:17:19.107775 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-qgprz"] Oct 03 14:17:19 crc kubenswrapper[4636]: I1003 14:17:19.187077 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-qgprz\" (UID: \"a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1\") " pod="openstack/dnsmasq-dns-7cb5889db5-qgprz" Oct 03 14:17:19 crc kubenswrapper[4636]: I1003 14:17:19.187147 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1-config\") pod \"dnsmasq-dns-7cb5889db5-qgprz\" (UID: \"a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1\") " pod="openstack/dnsmasq-dns-7cb5889db5-qgprz" Oct 03 14:17:19 crc kubenswrapper[4636]: I1003 14:17:19.187474 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8th6\" (UniqueName: \"kubernetes.io/projected/a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1-kube-api-access-j8th6\") pod \"dnsmasq-dns-7cb5889db5-qgprz\" (UID: \"a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1\") " pod="openstack/dnsmasq-dns-7cb5889db5-qgprz" Oct 03 14:17:19 crc kubenswrapper[4636]: I1003 14:17:19.288631 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8th6\" (UniqueName: \"kubernetes.io/projected/a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1-kube-api-access-j8th6\") pod \"dnsmasq-dns-7cb5889db5-qgprz\" (UID: \"a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1\") " pod="openstack/dnsmasq-dns-7cb5889db5-qgprz" Oct 03 14:17:19 crc kubenswrapper[4636]: I1003 14:17:19.288678 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-qgprz\" (UID: \"a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1\") " pod="openstack/dnsmasq-dns-7cb5889db5-qgprz" Oct 03 14:17:19 crc kubenswrapper[4636]: I1003 14:17:19.288708 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1-config\") pod \"dnsmasq-dns-7cb5889db5-qgprz\" (UID: \"a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1\") " pod="openstack/dnsmasq-dns-7cb5889db5-qgprz" Oct 03 14:17:19 crc kubenswrapper[4636]: I1003 14:17:19.289574 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1-config\") pod \"dnsmasq-dns-7cb5889db5-qgprz\" (UID: \"a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1\") " pod="openstack/dnsmasq-dns-7cb5889db5-qgprz" Oct 03 14:17:19 crc kubenswrapper[4636]: I1003 14:17:19.289840 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-qgprz\" (UID: \"a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1\") " pod="openstack/dnsmasq-dns-7cb5889db5-qgprz" Oct 03 14:17:19 crc kubenswrapper[4636]: I1003 14:17:19.338277 4636 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8th6\" (UniqueName: \"kubernetes.io/projected/a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1-kube-api-access-j8th6\") pod \"dnsmasq-dns-7cb5889db5-qgprz\" (UID: \"a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1\") " pod="openstack/dnsmasq-dns-7cb5889db5-qgprz" Oct 03 14:17:19 crc kubenswrapper[4636]: I1003 14:17:19.410597 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-qgprz" Oct 03 14:17:19 crc kubenswrapper[4636]: I1003 14:17:19.799690 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rxpgk" Oct 03 14:17:19 crc kubenswrapper[4636]: I1003 14:17:19.973583 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rxpgk" event={"ID":"948352a1-041e-447b-b36a-97438f39d0e8","Type":"ContainerDied","Data":"0a90a3e3bd66e3264527c0a73ac59419790907ebcb8f0f8c12bdce9194b6ecbb"} Oct 03 14:17:19 crc kubenswrapper[4636]: I1003 14:17:19.973645 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rxpgk" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.021532 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f6tk\" (UniqueName: \"kubernetes.io/projected/948352a1-041e-447b-b36a-97438f39d0e8-kube-api-access-8f6tk\") pod \"948352a1-041e-447b-b36a-97438f39d0e8\" (UID: \"948352a1-041e-447b-b36a-97438f39d0e8\") " Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.021630 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/948352a1-041e-447b-b36a-97438f39d0e8-config\") pod \"948352a1-041e-447b-b36a-97438f39d0e8\" (UID: \"948352a1-041e-447b-b36a-97438f39d0e8\") " Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.021674 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/948352a1-041e-447b-b36a-97438f39d0e8-dns-svc\") pod \"948352a1-041e-447b-b36a-97438f39d0e8\" (UID: \"948352a1-041e-447b-b36a-97438f39d0e8\") " Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.022930 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/948352a1-041e-447b-b36a-97438f39d0e8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "948352a1-041e-447b-b36a-97438f39d0e8" (UID: "948352a1-041e-447b-b36a-97438f39d0e8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.023571 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/948352a1-041e-447b-b36a-97438f39d0e8-config" (OuterVolumeSpecName: "config") pod "948352a1-041e-447b-b36a-97438f39d0e8" (UID: "948352a1-041e-447b-b36a-97438f39d0e8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.054881 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/948352a1-041e-447b-b36a-97438f39d0e8-kube-api-access-8f6tk" (OuterVolumeSpecName: "kube-api-access-8f6tk") pod "948352a1-041e-447b-b36a-97438f39d0e8" (UID: "948352a1-041e-447b-b36a-97438f39d0e8"). InnerVolumeSpecName "kube-api-access-8f6tk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.126625 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f6tk\" (UniqueName: \"kubernetes.io/projected/948352a1-041e-447b-b36a-97438f39d0e8-kube-api-access-8f6tk\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.127558 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/948352a1-041e-447b-b36a-97438f39d0e8-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.127572 4636 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/948352a1-041e-447b-b36a-97438f39d0e8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.172539 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.180326 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.182736 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.182973 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-tjdld" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.183349 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.183521 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.183610 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.275982 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-qgprz"] Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.331247 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift\") pod \"swift-storage-0\" (UID: \"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") " pod="openstack/swift-storage-0" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.331334 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/201b506e-9cc5-4ab0-9af4-96a357d19f6e-lock\") pod \"swift-storage-0\" (UID: \"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") " pod="openstack/swift-storage-0" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.331373 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") " pod="openstack/swift-storage-0" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.331397 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/201b506e-9cc5-4ab0-9af4-96a357d19f6e-cache\") pod \"swift-storage-0\" (UID: 
\"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") " pod="openstack/swift-storage-0" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.331418 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq2tk\" (UniqueName: \"kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-kube-api-access-tq2tk\") pod \"swift-storage-0\" (UID: \"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") " pod="openstack/swift-storage-0" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.336068 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rxpgk"] Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.341567 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rxpgk"] Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.432925 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/201b506e-9cc5-4ab0-9af4-96a357d19f6e-lock\") pod \"swift-storage-0\" (UID: \"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") " pod="openstack/swift-storage-0" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.432995 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") " pod="openstack/swift-storage-0" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.433037 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/201b506e-9cc5-4ab0-9af4-96a357d19f6e-cache\") pod \"swift-storage-0\" (UID: \"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") " pod="openstack/swift-storage-0" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.433060 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq2tk\" (UniqueName: \"kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-kube-api-access-tq2tk\") pod \"swift-storage-0\" (UID: \"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") " pod="openstack/swift-storage-0" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.433107 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift\") pod \"swift-storage-0\" (UID: \"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") " pod="openstack/swift-storage-0" Oct 03 14:17:20 crc kubenswrapper[4636]: E1003 14:17:20.433297 4636 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 14:17:20 crc kubenswrapper[4636]: E1003 14:17:20.433315 4636 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 14:17:20 crc kubenswrapper[4636]: E1003 14:17:20.433364 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift podName:201b506e-9cc5-4ab0-9af4-96a357d19f6e nodeName:}" failed. No retries permitted until 2025-10-03 14:17:20.933345562 +0000 UTC m=+990.792071809 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift") pod "swift-storage-0" (UID: "201b506e-9cc5-4ab0-9af4-96a357d19f6e") : configmap "swift-ring-files" not found Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.433443 4636 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.433714 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/201b506e-9cc5-4ab0-9af4-96a357d19f6e-cache\") pod \"swift-storage-0\" (UID: \"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") " pod="openstack/swift-storage-0" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.434474 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/201b506e-9cc5-4ab0-9af4-96a357d19f6e-lock\") pod \"swift-storage-0\" (UID: \"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") " pod="openstack/swift-storage-0" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.454934 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq2tk\" (UniqueName: \"kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-kube-api-access-tq2tk\") pod \"swift-storage-0\" (UID: \"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") " pod="openstack/swift-storage-0" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.459961 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") " pod="openstack/swift-storage-0" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.803750 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="948352a1-041e-447b-b36a-97438f39d0e8" path="/var/lib/kubelet/pods/948352a1-041e-447b-b36a-97438f39d0e8/volumes" Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.941053 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift\") pod \"swift-storage-0\" (UID: \"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") " pod="openstack/swift-storage-0" Oct 03 14:17:20 crc kubenswrapper[4636]: E1003 14:17:20.942304 4636 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 14:17:20 crc kubenswrapper[4636]: E1003 14:17:20.942330 4636 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 14:17:20 crc kubenswrapper[4636]: E1003 14:17:20.942367 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift podName:201b506e-9cc5-4ab0-9af4-96a357d19f6e nodeName:}" failed. No retries permitted until 2025-10-03 14:17:21.942352514 +0000 UTC m=+991.801078761 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift") pod "swift-storage-0" (UID: "201b506e-9cc5-4ab0-9af4-96a357d19f6e") : configmap "swift-ring-files" not found Oct 03 14:17:20 crc kubenswrapper[4636]: I1003 14:17:20.979757 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-qgprz" event={"ID":"a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1","Type":"ContainerStarted","Data":"fed58d542bdfe6697d0eeaa11d1b9210bad75d51d04f6adeb920ee794377aa6e"} Oct 03 14:17:21 crc kubenswrapper[4636]: I1003 14:17:21.955776 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift\") pod \"swift-storage-0\" (UID: \"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") " pod="openstack/swift-storage-0" Oct 03 14:17:21 crc kubenswrapper[4636]: E1003 14:17:21.955999 4636 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 14:17:21 crc kubenswrapper[4636]: E1003 14:17:21.956140 4636 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 14:17:21 crc kubenswrapper[4636]: E1003 14:17:21.956195 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift podName:201b506e-9cc5-4ab0-9af4-96a357d19f6e nodeName:}" failed. No retries permitted until 2025-10-03 14:17:23.956177048 +0000 UTC m=+993.814903295 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift") pod "swift-storage-0" (UID: "201b506e-9cc5-4ab0-9af4-96a357d19f6e") : configmap "swift-ring-files" not found Oct 03 14:17:23 crc kubenswrapper[4636]: I1003 14:17:23.997654 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift\") pod \"swift-storage-0\" (UID: \"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") " pod="openstack/swift-storage-0" Oct 03 14:17:23 crc kubenswrapper[4636]: E1003 14:17:23.997905 4636 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 14:17:23 crc kubenswrapper[4636]: E1003 14:17:23.998196 4636 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 14:17:23 crc kubenswrapper[4636]: E1003 14:17:23.998254 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift podName:201b506e-9cc5-4ab0-9af4-96a357d19f6e nodeName:}" failed. No retries permitted until 2025-10-03 14:17:27.998235548 +0000 UTC m=+997.856961805 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift") pod "swift-storage-0" (UID: "201b506e-9cc5-4ab0-9af4-96a357d19f6e") : configmap "swift-ring-files" not found Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.012774 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"781432ad-b393-4271-8a8a-39254e422cd4","Type":"ContainerStarted","Data":"b678da20137042b24c0a41938dae0098b5fcf58f9f25550012416c959a7a5a83"} Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.014946 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"25cea5cd-0d10-4569-952f-a884d6478382","Type":"ContainerStarted","Data":"2c77e85082d5dc686be74e7abede4ccbb2b24ed06f947e770b4850e36939b31f"} Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.131952 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-qz8ds"] Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.133118 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:24 crc kubenswrapper[4636]: W1003 14:17:24.134980 4636 reflector.go:561] object-"openstack"/"swift-ring-config-data": failed to list *v1.ConfigMap: configmaps "swift-ring-config-data" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Oct 03 14:17:24 crc kubenswrapper[4636]: E1003 14:17:24.135028 4636 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"swift-ring-config-data\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"swift-ring-config-data\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 03 14:17:24 crc kubenswrapper[4636]: W1003 14:17:24.135279 4636 reflector.go:561] object-"openstack"/"swift-ring-scripts": failed to list *v1.ConfigMap: configmaps "swift-ring-scripts" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Oct 03 14:17:24 crc kubenswrapper[4636]: E1003 14:17:24.135325 4636 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"swift-ring-scripts\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"swift-ring-scripts\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 03 14:17:24 crc kubenswrapper[4636]: W1003 14:17:24.136898 4636 reflector.go:561] object-"openstack"/"swift-proxy-config-data": failed to list *v1.Secret: secrets "swift-proxy-config-data" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Oct 03 14:17:24 crc kubenswrapper[4636]: E1003 14:17:24.136947 4636 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"swift-proxy-config-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"swift-proxy-config-data\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API 
group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.156555 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-qz8ds"] Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.303273 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-etc-swift\") pod \"swift-ring-rebalance-qz8ds\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.303323 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-combined-ca-bundle\") pod \"swift-ring-rebalance-qz8ds\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.303342 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-scripts\") pod \"swift-ring-rebalance-qz8ds\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.303371 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-ring-data-devices\") pod \"swift-ring-rebalance-qz8ds\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.303443 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-swiftconf\") pod \"swift-ring-rebalance-qz8ds\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.303486 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-dispersionconf\") pod \"swift-ring-rebalance-qz8ds\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.303504 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj2tb\" (UniqueName: \"kubernetes.io/projected/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-kube-api-access-zj2tb\") pod \"swift-ring-rebalance-qz8ds\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.405313 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-ring-data-devices\") pod \"swift-ring-rebalance-qz8ds\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.405424 
4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-swiftconf\") pod \"swift-ring-rebalance-qz8ds\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.405491 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-dispersionconf\") pod \"swift-ring-rebalance-qz8ds\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.405527 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj2tb\" (UniqueName: \"kubernetes.io/projected/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-kube-api-access-zj2tb\") pod \"swift-ring-rebalance-qz8ds\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.405603 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-etc-swift\") pod \"swift-ring-rebalance-qz8ds\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.405626 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-scripts\") pod \"swift-ring-rebalance-qz8ds\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.405646 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-combined-ca-bundle\") pod \"swift-ring-rebalance-qz8ds\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.405994 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-etc-swift\") pod \"swift-ring-rebalance-qz8ds\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.412432 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-swiftconf\") pod \"swift-ring-rebalance-qz8ds\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.413325 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-combined-ca-bundle\") pod \"swift-ring-rebalance-qz8ds\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.421501 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj2tb\" (UniqueName: 
\"kubernetes.io/projected/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-kube-api-access-zj2tb\") pod \"swift-ring-rebalance-qz8ds\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:24 crc kubenswrapper[4636]: I1003 14:17:24.990320 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 03 14:17:25 crc kubenswrapper[4636]: I1003 14:17:25.021427 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 03 14:17:25 crc kubenswrapper[4636]: I1003 14:17:25.036406 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=24.763273009 podStartE2EDuration="37.03638547s" podCreationTimestamp="2025-10-03 14:16:48 +0000 UTC" firstStartedPulling="2025-10-03 14:17:07.700953903 +0000 UTC m=+977.559680150" lastFinishedPulling="2025-10-03 14:17:19.974066364 +0000 UTC m=+989.832792611" observedRunningTime="2025-10-03 14:17:25.034668806 +0000 UTC m=+994.893395053" watchObservedRunningTime="2025-10-03 14:17:25.03638547 +0000 UTC m=+994.895111717" Oct 03 14:17:25 crc kubenswrapper[4636]: I1003 14:17:25.199792 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 03 14:17:25 crc kubenswrapper[4636]: I1003 14:17:25.352757 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 03 14:17:28 crc kubenswrapper[4636]: I1003 14:17:25.577055 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-ring-data-devices\") pod \"swift-ring-rebalance-qz8ds\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:28 crc kubenswrapper[4636]: I1003 14:17:25.580518 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-dispersionconf\") pod \"swift-ring-rebalance-qz8ds\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:28 crc kubenswrapper[4636]: I1003 14:17:25.631760 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-scripts\") pod \"swift-ring-rebalance-qz8ds\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:28 crc kubenswrapper[4636]: I1003 14:17:25.655470 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:28 crc kubenswrapper[4636]: I1003 14:17:26.748330 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 03 14:17:28 crc kubenswrapper[4636]: I1003 14:17:26.748598 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 03 14:17:28 crc kubenswrapper[4636]: I1003 14:17:28.062031 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift\") pod \"swift-storage-0\" (UID: \"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") " pod="openstack/swift-storage-0" Oct 03 14:17:28 crc kubenswrapper[4636]: E1003 14:17:28.062265 4636 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 14:17:28 crc kubenswrapper[4636]: E1003 14:17:28.062281 4636 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 14:17:28 crc kubenswrapper[4636]: E1003 14:17:28.062334 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift podName:201b506e-9cc5-4ab0-9af4-96a357d19f6e nodeName:}" failed. No retries permitted until 2025-10-03 14:17:36.062316843 +0000 UTC m=+1005.921043090 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift") pod "swift-storage-0" (UID: "201b506e-9cc5-4ab0-9af4-96a357d19f6e") : configmap "swift-ring-files" not found Oct 03 14:17:29 crc kubenswrapper[4636]: I1003 14:17:29.024618 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 03 14:17:29 crc kubenswrapper[4636]: I1003 14:17:29.289570 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-qz8ds"] Oct 03 14:17:33 crc kubenswrapper[4636]: I1003 14:17:33.088399 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qz8ds" event={"ID":"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7","Type":"ContainerStarted","Data":"1f5f2f2a43f22f1751970aecb30772b6aec59d9ac2f962da4b38c292a5a07ea1"} Oct 03 14:17:36 crc kubenswrapper[4636]: I1003 14:17:36.091211 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift\") pod \"swift-storage-0\" (UID: \"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") " pod="openstack/swift-storage-0" Oct 03 14:17:36 crc kubenswrapper[4636]: E1003 14:17:36.091231 4636 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 14:17:36 crc kubenswrapper[4636]: E1003 14:17:36.091971 4636 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 14:17:36 crc kubenswrapper[4636]: E1003 14:17:36.092034 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift podName:201b506e-9cc5-4ab0-9af4-96a357d19f6e nodeName:}" failed. No retries permitted until 2025-10-03 14:17:52.092013124 +0000 UTC m=+1021.950739371 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift") pod "swift-storage-0" (UID: "201b506e-9cc5-4ab0-9af4-96a357d19f6e") : configmap "swift-ring-files" not found Oct 03 14:17:41 crc kubenswrapper[4636]: I1003 14:17:41.140650 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-qgprz" event={"ID":"a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1","Type":"ContainerStarted","Data":"203009d098dbb36ec081a4b70c42a479366d8f1f94c1e6dcf5f466a7b6a93aa3"} Oct 03 14:17:41 crc kubenswrapper[4636]: I1003 14:17:41.142426 4636 generic.go:334] "Generic (PLEG): container finished" podID="5f862438-7485-4e2c-a5b5-a6f3acf809ab" containerID="01ffb247be0a30764551cb82b336ce02fedb37cd673ba1df60771c5e5fa407af" exitCode=0 Oct 03 14:17:41 crc kubenswrapper[4636]: I1003 14:17:41.142485 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5f862438-7485-4e2c-a5b5-a6f3acf809ab","Type":"ContainerDied","Data":"01ffb247be0a30764551cb82b336ce02fedb37cd673ba1df60771c5e5fa407af"} Oct 03 14:17:41 crc kubenswrapper[4636]: I1003 14:17:41.144744 4636 generic.go:334] "Generic (PLEG): container finished" podID="61bd2d74-76de-402c-99af-f18ddf19610c" containerID="73409de03024ed94671ee96763ba393f0a99ca4a5dad0d25b9dbdad608cb9eb8" exitCode=0 Oct 03 14:17:41 crc kubenswrapper[4636]: I1003 14:17:41.144833 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"61bd2d74-76de-402c-99af-f18ddf19610c","Type":"ContainerDied","Data":"73409de03024ed94671ee96763ba393f0a99ca4a5dad0d25b9dbdad608cb9eb8"} Oct 03 14:17:41 crc kubenswrapper[4636]: I1003 14:17:41.148892 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-5dc2p" event={"ID":"c6ae5571-6f09-4f64-b071-d669dc4d3f1f","Type":"ContainerStarted","Data":"cb7f050c8bb4ead1c1c0e3658d7b15df40cfd3bebfae0db7a9b5d72dfde820d1"} Oct 03 14:17:41 crc kubenswrapper[4636]: I1003 14:17:41.253746 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 03 14:17:41 crc kubenswrapper[4636]: E1003 14:17:41.311378 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified" Oct 03 14:17:41 crc kubenswrapper[4636]: E1003 14:17:41.311552 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:openstack-network-exporter,Image:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,Command:[/app/openstack-network-exporter],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPENSTACK_NETWORK_EXPORTER_YAML,Value:/etc/config/openstack-network-exporter.yaml,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovnmetrics.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovnmetrics.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mpkw7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-sb-0_openstack(2a4510e7-aa39-4e1f-80bb-196127d2643c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:17:41 crc kubenswrapper[4636]: E1003 14:17:41.312731 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstack-network-exporter\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-sb-0" podUID="2a4510e7-aa39-4e1f-80bb-196127d2643c" Oct 03 14:17:41 crc kubenswrapper[4636]: I1003 14:17:41.331025 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="b3439f9c-0086-413d-a84f-79e7da2ffcbd" containerName="galera" probeResult="failure" output=< Oct 03 14:17:41 crc kubenswrapper[4636]: wsrep_local_state_comment (Joined) differs from Synced Oct 03 14:17:41 crc kubenswrapper[4636]: > Oct 03 14:17:42 crc kubenswrapper[4636]: I1003 14:17:42.158907 4636 generic.go:334] "Generic (PLEG): container finished" podID="c6ae5571-6f09-4f64-b071-d669dc4d3f1f" containerID="cb7f050c8bb4ead1c1c0e3658d7b15df40cfd3bebfae0db7a9b5d72dfde820d1" exitCode=0 Oct 03 14:17:42 crc kubenswrapper[4636]: I1003 14:17:42.158971 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-5dc2p" 
event={"ID":"c6ae5571-6f09-4f64-b071-d669dc4d3f1f","Type":"ContainerDied","Data":"cb7f050c8bb4ead1c1c0e3658d7b15df40cfd3bebfae0db7a9b5d72dfde820d1"} Oct 03 14:17:42 crc kubenswrapper[4636]: I1003 14:17:42.162598 4636 generic.go:334] "Generic (PLEG): container finished" podID="a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1" containerID="203009d098dbb36ec081a4b70c42a479366d8f1f94c1e6dcf5f466a7b6a93aa3" exitCode=0 Oct 03 14:17:42 crc kubenswrapper[4636]: I1003 14:17:42.162721 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-qgprz" event={"ID":"a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1","Type":"ContainerDied","Data":"203009d098dbb36ec081a4b70c42a479366d8f1f94c1e6dcf5f466a7b6a93aa3"} Oct 03 14:17:42 crc kubenswrapper[4636]: I1003 14:17:42.168806 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"af99ddda-1ae6-4b70-9422-06c99e8664e5","Type":"ContainerStarted","Data":"2daf8dcc57a6ab8a717af379f8b155def2cbed5375bbdf2e4d101fd8c8cffb9a"} Oct 03 14:17:42 crc kubenswrapper[4636]: I1003 14:17:42.171995 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5f862438-7485-4e2c-a5b5-a6f3acf809ab","Type":"ContainerStarted","Data":"8d563ad166e37c6168c6f9ce5f3e2a2155b1b86d8a1584f09d19152a381e7fd5"} Oct 03 14:17:42 crc kubenswrapper[4636]: I1003 14:17:42.172457 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:17:42 crc kubenswrapper[4636]: I1003 14:17:42.174025 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"61bd2d74-76de-402c-99af-f18ddf19610c","Type":"ContainerStarted","Data":"a1244419d73c937ed1b24b5c9360f4b695e372cfd2dc45c18e9b254746692304"} Oct 03 14:17:42 crc kubenswrapper[4636]: I1003 14:17:42.234529 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.359418047 podStartE2EDuration="1m0.234508515s" podCreationTimestamp="2025-10-03 14:16:42 +0000 UTC" firstStartedPulling="2025-10-03 14:16:44.318592849 +0000 UTC m=+954.177319096" lastFinishedPulling="2025-10-03 14:17:07.193683317 +0000 UTC m=+977.052409564" observedRunningTime="2025-10-03 14:17:42.230236725 +0000 UTC m=+1012.088962972" watchObservedRunningTime="2025-10-03 14:17:42.234508515 +0000 UTC m=+1012.093234762" Oct 03 14:17:42 crc kubenswrapper[4636]: I1003 14:17:42.280301 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=14.453638019 podStartE2EDuration="50.280276385s" podCreationTimestamp="2025-10-03 14:16:52 +0000 UTC" firstStartedPulling="2025-10-03 14:17:05.488192143 +0000 UTC m=+975.346918390" lastFinishedPulling="2025-10-03 14:17:41.314830509 +0000 UTC m=+1011.173556756" observedRunningTime="2025-10-03 14:17:42.275959644 +0000 UTC m=+1012.134685901" watchObservedRunningTime="2025-10-03 14:17:42.280276385 +0000 UTC m=+1012.139002632" Oct 03 14:17:42 crc kubenswrapper[4636]: I1003 14:17:42.309483 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.754168784 podStartE2EDuration="1m0.309460728s" podCreationTimestamp="2025-10-03 14:16:42 +0000 UTC" firstStartedPulling="2025-10-03 14:16:44.835016741 +0000 UTC m=+954.693742988" lastFinishedPulling="2025-10-03 14:17:07.390308675 +0000 UTC m=+977.249034932" observedRunningTime="2025-10-03 14:17:42.30063436 +0000 UTC 
m=+1012.159360627" watchObservedRunningTime="2025-10-03 14:17:42.309460728 +0000 UTC m=+1012.168186975" Oct 03 14:17:43 crc kubenswrapper[4636]: I1003 14:17:43.420255 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 03 14:17:43 crc kubenswrapper[4636]: I1003 14:17:43.773007 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 03 14:17:44 crc kubenswrapper[4636]: I1003 14:17:44.341540 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 03 14:17:44 crc kubenswrapper[4636]: I1003 14:17:44.400000 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 03 14:17:44 crc kubenswrapper[4636]: I1003 14:17:44.423459 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 03 14:17:44 crc kubenswrapper[4636]: I1003 14:17:44.469897 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.246874 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.561774 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5dc2p"] Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.631932 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-z2z7s"] Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.633131 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.658010 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.693853 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-z2z7s"] Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.758580 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjrgq\" (UniqueName: \"kubernetes.io/projected/d73f8680-f04e-4b8f-9a56-c0a4921e950a-kube-api-access-vjrgq\") pod \"dnsmasq-dns-74f6f696b9-z2z7s\" (UID: \"d73f8680-f04e-4b8f-9a56-c0a4921e950a\") " pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.759130 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d73f8680-f04e-4b8f-9a56-c0a4921e950a-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-z2z7s\" (UID: \"d73f8680-f04e-4b8f-9a56-c0a4921e950a\") " pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.759242 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73f8680-f04e-4b8f-9a56-c0a4921e950a-config\") pod \"dnsmasq-dns-74f6f696b9-z2z7s\" (UID: \"d73f8680-f04e-4b8f-9a56-c0a4921e950a\") " pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.759338 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/d73f8680-f04e-4b8f-9a56-c0a4921e950a-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-z2z7s\" (UID: \"d73f8680-f04e-4b8f-9a56-c0a4921e950a\") " pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.801549 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-4f5ff"] Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.802621 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-4f5ff" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.805149 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.809054 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4f5ff"] Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.861342 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/291d0189-08a0-4b8b-8406-8601de0e3708-ovn-rundir\") pod \"ovn-controller-metrics-4f5ff\" (UID: \"291d0189-08a0-4b8b-8406-8601de0e3708\") " pod="openstack/ovn-controller-metrics-4f5ff" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.861396 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/291d0189-08a0-4b8b-8406-8601de0e3708-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4f5ff\" (UID: \"291d0189-08a0-4b8b-8406-8601de0e3708\") " pod="openstack/ovn-controller-metrics-4f5ff" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.861430 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/291d0189-08a0-4b8b-8406-8601de0e3708-combined-ca-bundle\") pod \"ovn-controller-metrics-4f5ff\" (UID: \"291d0189-08a0-4b8b-8406-8601de0e3708\") " pod="openstack/ovn-controller-metrics-4f5ff" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.861454 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjrgq\" (UniqueName: \"kubernetes.io/projected/d73f8680-f04e-4b8f-9a56-c0a4921e950a-kube-api-access-vjrgq\") pod \"dnsmasq-dns-74f6f696b9-z2z7s\" (UID: \"d73f8680-f04e-4b8f-9a56-c0a4921e950a\") " pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.861517 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d73f8680-f04e-4b8f-9a56-c0a4921e950a-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-z2z7s\" (UID: \"d73f8680-f04e-4b8f-9a56-c0a4921e950a\") " pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.861536 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73f8680-f04e-4b8f-9a56-c0a4921e950a-config\") pod \"dnsmasq-dns-74f6f696b9-z2z7s\" (UID: \"d73f8680-f04e-4b8f-9a56-c0a4921e950a\") " pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.861559 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d73f8680-f04e-4b8f-9a56-c0a4921e950a-ovsdbserver-nb\") pod 
\"dnsmasq-dns-74f6f696b9-z2z7s\" (UID: \"d73f8680-f04e-4b8f-9a56-c0a4921e950a\") " pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.861592 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw6dq\" (UniqueName: \"kubernetes.io/projected/291d0189-08a0-4b8b-8406-8601de0e3708-kube-api-access-fw6dq\") pod \"ovn-controller-metrics-4f5ff\" (UID: \"291d0189-08a0-4b8b-8406-8601de0e3708\") " pod="openstack/ovn-controller-metrics-4f5ff" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.861623 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/291d0189-08a0-4b8b-8406-8601de0e3708-config\") pod \"ovn-controller-metrics-4f5ff\" (UID: \"291d0189-08a0-4b8b-8406-8601de0e3708\") " pod="openstack/ovn-controller-metrics-4f5ff" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.861653 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/291d0189-08a0-4b8b-8406-8601de0e3708-ovs-rundir\") pod \"ovn-controller-metrics-4f5ff\" (UID: \"291d0189-08a0-4b8b-8406-8601de0e3708\") " pod="openstack/ovn-controller-metrics-4f5ff" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.862705 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d73f8680-f04e-4b8f-9a56-c0a4921e950a-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-z2z7s\" (UID: \"d73f8680-f04e-4b8f-9a56-c0a4921e950a\") " pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.863061 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d73f8680-f04e-4b8f-9a56-c0a4921e950a-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-z2z7s\" (UID: \"d73f8680-f04e-4b8f-9a56-c0a4921e950a\") " pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.863512 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73f8680-f04e-4b8f-9a56-c0a4921e950a-config\") pod \"dnsmasq-dns-74f6f696b9-z2z7s\" (UID: \"d73f8680-f04e-4b8f-9a56-c0a4921e950a\") " pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.879920 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjrgq\" (UniqueName: \"kubernetes.io/projected/d73f8680-f04e-4b8f-9a56-c0a4921e950a-kube-api-access-vjrgq\") pod \"dnsmasq-dns-74f6f696b9-z2z7s\" (UID: \"d73f8680-f04e-4b8f-9a56-c0a4921e950a\") " pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.951024 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.963091 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/291d0189-08a0-4b8b-8406-8601de0e3708-ovn-rundir\") pod \"ovn-controller-metrics-4f5ff\" (UID: \"291d0189-08a0-4b8b-8406-8601de0e3708\") " pod="openstack/ovn-controller-metrics-4f5ff" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.963468 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/291d0189-08a0-4b8b-8406-8601de0e3708-ovn-rundir\") pod \"ovn-controller-metrics-4f5ff\" (UID: \"291d0189-08a0-4b8b-8406-8601de0e3708\") " pod="openstack/ovn-controller-metrics-4f5ff" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.963222 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/291d0189-08a0-4b8b-8406-8601de0e3708-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4f5ff\" (UID: \"291d0189-08a0-4b8b-8406-8601de0e3708\") " pod="openstack/ovn-controller-metrics-4f5ff" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.963804 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/291d0189-08a0-4b8b-8406-8601de0e3708-combined-ca-bundle\") pod \"ovn-controller-metrics-4f5ff\" (UID: \"291d0189-08a0-4b8b-8406-8601de0e3708\") " pod="openstack/ovn-controller-metrics-4f5ff" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.963967 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw6dq\" (UniqueName: \"kubernetes.io/projected/291d0189-08a0-4b8b-8406-8601de0e3708-kube-api-access-fw6dq\") pod \"ovn-controller-metrics-4f5ff\" (UID: \"291d0189-08a0-4b8b-8406-8601de0e3708\") " pod="openstack/ovn-controller-metrics-4f5ff" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.964030 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/291d0189-08a0-4b8b-8406-8601de0e3708-config\") pod \"ovn-controller-metrics-4f5ff\" (UID: \"291d0189-08a0-4b8b-8406-8601de0e3708\") " pod="openstack/ovn-controller-metrics-4f5ff" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.964074 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/291d0189-08a0-4b8b-8406-8601de0e3708-ovs-rundir\") pod \"ovn-controller-metrics-4f5ff\" (UID: \"291d0189-08a0-4b8b-8406-8601de0e3708\") " pod="openstack/ovn-controller-metrics-4f5ff" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.964208 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/291d0189-08a0-4b8b-8406-8601de0e3708-ovs-rundir\") pod \"ovn-controller-metrics-4f5ff\" (UID: \"291d0189-08a0-4b8b-8406-8601de0e3708\") " pod="openstack/ovn-controller-metrics-4f5ff" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.964887 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/291d0189-08a0-4b8b-8406-8601de0e3708-config\") pod \"ovn-controller-metrics-4f5ff\" (UID: \"291d0189-08a0-4b8b-8406-8601de0e3708\") " pod="openstack/ovn-controller-metrics-4f5ff" Oct 03 14:17:45 crc 
kubenswrapper[4636]: I1003 14:17:45.967524 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/291d0189-08a0-4b8b-8406-8601de0e3708-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4f5ff\" (UID: \"291d0189-08a0-4b8b-8406-8601de0e3708\") " pod="openstack/ovn-controller-metrics-4f5ff" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.968023 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/291d0189-08a0-4b8b-8406-8601de0e3708-combined-ca-bundle\") pod \"ovn-controller-metrics-4f5ff\" (UID: \"291d0189-08a0-4b8b-8406-8601de0e3708\") " pod="openstack/ovn-controller-metrics-4f5ff" Oct 03 14:17:45 crc kubenswrapper[4636]: I1003 14:17:45.983218 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw6dq\" (UniqueName: \"kubernetes.io/projected/291d0189-08a0-4b8b-8406-8601de0e3708-kube-api-access-fw6dq\") pod \"ovn-controller-metrics-4f5ff\" (UID: \"291d0189-08a0-4b8b-8406-8601de0e3708\") " pod="openstack/ovn-controller-metrics-4f5ff" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.131265 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-4f5ff" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.233901 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-qgprz" event={"ID":"a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1","Type":"ContainerStarted","Data":"de0ccd39fd44972fe1cecd21db24b60500aad7adbbabcda6840e86ea5363ace7"} Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.234951 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cb5889db5-qgprz" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.249218 4636 generic.go:334] "Generic (PLEG): container finished" podID="781432ad-b393-4271-8a8a-39254e422cd4" containerID="b678da20137042b24c0a41938dae0098b5fcf58f9f25550012416c959a7a5a83" exitCode=0 Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.249299 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"781432ad-b393-4271-8a8a-39254e422cd4","Type":"ContainerDied","Data":"b678da20137042b24c0a41938dae0098b5fcf58f9f25550012416c959a7a5a83"} Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.269955 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cb5889db5-qgprz" podStartSLOduration=10.306650489 podStartE2EDuration="27.269925931s" podCreationTimestamp="2025-10-03 14:17:19 +0000 UTC" firstStartedPulling="2025-10-03 14:17:20.29553915 +0000 UTC m=+990.154265397" lastFinishedPulling="2025-10-03 14:17:37.258814582 +0000 UTC m=+1007.117540839" observedRunningTime="2025-10-03 14:17:46.26135069 +0000 UTC m=+1016.120076947" watchObservedRunningTime="2025-10-03 14:17:46.269925931 +0000 UTC m=+1016.128652178" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.272400 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2a4510e7-aa39-4e1f-80bb-196127d2643c","Type":"ContainerStarted","Data":"6cf173cda9db1b4488212933ba6db7c7f201e02191fe2980320b80a3d645e7cb"} Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.273501 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.304569 4636 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-5dc2p" podUID="c6ae5571-6f09-4f64-b071-d669dc4d3f1f" containerName="dnsmasq-dns" containerID="cri-o://31ffbab704418cf4f3efd87189d19e5d9f8eabd576b99d4c50560183d947a549" gracePeriod=10 Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.305045 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-5dc2p" event={"ID":"c6ae5571-6f09-4f64-b071-d669dc4d3f1f","Type":"ContainerStarted","Data":"31ffbab704418cf4f3efd87189d19e5d9f8eabd576b99d4c50560183d947a549"} Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.305117 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-5dc2p" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.314495 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-qgprz"] Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.359251 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qz8ds" event={"ID":"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7","Type":"ContainerStarted","Data":"e8fff478098aff86770ab075a906118add1e2a5e10185f72c8bfd3fd501935af"} Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.382552 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=47.59625528 podStartE2EDuration="52.382443791s" podCreationTimestamp="2025-10-03 14:16:54 +0000 UTC" firstStartedPulling="2025-10-03 14:17:07.937393128 +0000 UTC m=+977.796119375" lastFinishedPulling="2025-10-03 14:17:12.723581639 +0000 UTC m=+982.582307886" observedRunningTime="2025-10-03 14:17:46.317045755 +0000 UTC m=+1016.175771992" watchObservedRunningTime="2025-10-03 14:17:46.382443791 +0000 UTC m=+1016.241170038" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.403906 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.439329 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-5dc2p" podStartSLOduration=9.715465742 podStartE2EDuration="1m4.439309967s" podCreationTimestamp="2025-10-03 14:16:42 +0000 UTC" firstStartedPulling="2025-10-03 14:16:43.1429613 +0000 UTC m=+953.001687547" lastFinishedPulling="2025-10-03 14:17:37.866805515 +0000 UTC m=+1007.725531772" observedRunningTime="2025-10-03 14:17:46.350490408 +0000 UTC m=+1016.209216665" watchObservedRunningTime="2025-10-03 14:17:46.439309967 +0000 UTC m=+1016.298036214" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.461899 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-68zzd"] Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.463718 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-68zzd" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.472044 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-qz8ds" podStartSLOduration=10.067403983 podStartE2EDuration="22.47202278s" podCreationTimestamp="2025-10-03 14:17:24 +0000 UTC" firstStartedPulling="2025-10-03 14:17:33.077951257 +0000 UTC m=+1002.936677504" lastFinishedPulling="2025-10-03 14:17:45.482570044 +0000 UTC m=+1015.341296301" observedRunningTime="2025-10-03 14:17:46.438334432 +0000 UTC m=+1016.297060669" watchObservedRunningTime="2025-10-03 14:17:46.47202278 +0000 UTC m=+1016.330749027" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.478431 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.479453 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-68zzd"] Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.577816 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-config\") pod \"dnsmasq-dns-698758b865-68zzd\" (UID: \"3e923689-aa01-44f4-941e-56418b1c3fe5\") " pod="openstack/dnsmasq-dns-698758b865-68zzd" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.577921 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktps9\" (UniqueName: \"kubernetes.io/projected/3e923689-aa01-44f4-941e-56418b1c3fe5-kube-api-access-ktps9\") pod \"dnsmasq-dns-698758b865-68zzd\" (UID: \"3e923689-aa01-44f4-941e-56418b1c3fe5\") " pod="openstack/dnsmasq-dns-698758b865-68zzd" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.577991 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-dns-svc\") pod \"dnsmasq-dns-698758b865-68zzd\" (UID: \"3e923689-aa01-44f4-941e-56418b1c3fe5\") " pod="openstack/dnsmasq-dns-698758b865-68zzd" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.578011 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-68zzd\" (UID: \"3e923689-aa01-44f4-941e-56418b1c3fe5\") " pod="openstack/dnsmasq-dns-698758b865-68zzd" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.578075 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-68zzd\" (UID: \"3e923689-aa01-44f4-941e-56418b1c3fe5\") " pod="openstack/dnsmasq-dns-698758b865-68zzd" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.646435 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-z2z7s"] Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.679265 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-dns-svc\") pod \"dnsmasq-dns-698758b865-68zzd\" (UID: \"3e923689-aa01-44f4-941e-56418b1c3fe5\") " 
pod="openstack/dnsmasq-dns-698758b865-68zzd" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.679309 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-68zzd\" (UID: \"3e923689-aa01-44f4-941e-56418b1c3fe5\") " pod="openstack/dnsmasq-dns-698758b865-68zzd" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.679367 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-68zzd\" (UID: \"3e923689-aa01-44f4-941e-56418b1c3fe5\") " pod="openstack/dnsmasq-dns-698758b865-68zzd" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.679392 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-config\") pod \"dnsmasq-dns-698758b865-68zzd\" (UID: \"3e923689-aa01-44f4-941e-56418b1c3fe5\") " pod="openstack/dnsmasq-dns-698758b865-68zzd" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.679473 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktps9\" (UniqueName: \"kubernetes.io/projected/3e923689-aa01-44f4-941e-56418b1c3fe5-kube-api-access-ktps9\") pod \"dnsmasq-dns-698758b865-68zzd\" (UID: \"3e923689-aa01-44f4-941e-56418b1c3fe5\") " pod="openstack/dnsmasq-dns-698758b865-68zzd" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.680415 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-dns-svc\") pod \"dnsmasq-dns-698758b865-68zzd\" (UID: \"3e923689-aa01-44f4-941e-56418b1c3fe5\") " pod="openstack/dnsmasq-dns-698758b865-68zzd" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.680515 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-68zzd\" (UID: \"3e923689-aa01-44f4-941e-56418b1c3fe5\") " pod="openstack/dnsmasq-dns-698758b865-68zzd" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.681020 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-config\") pod \"dnsmasq-dns-698758b865-68zzd\" (UID: \"3e923689-aa01-44f4-941e-56418b1c3fe5\") " pod="openstack/dnsmasq-dns-698758b865-68zzd" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.681209 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-68zzd\" (UID: \"3e923689-aa01-44f4-941e-56418b1c3fe5\") " pod="openstack/dnsmasq-dns-698758b865-68zzd" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.705943 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktps9\" (UniqueName: \"kubernetes.io/projected/3e923689-aa01-44f4-941e-56418b1c3fe5-kube-api-access-ktps9\") pod \"dnsmasq-dns-698758b865-68zzd\" (UID: \"3e923689-aa01-44f4-941e-56418b1c3fe5\") " pod="openstack/dnsmasq-dns-698758b865-68zzd" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.804822 
4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-68zzd" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.874881 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4f5ff"] Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.932803 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.934123 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.937837 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-l5l44" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.939261 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.939526 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.939711 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.973458 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.981765 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-5dc2p" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.993757 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9017beb0-a89a-4efa-b304-ee0ab7a8ce54-config\") pod \"ovn-northd-0\" (UID: \"9017beb0-a89a-4efa-b304-ee0ab7a8ce54\") " pod="openstack/ovn-northd-0" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.993810 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9017beb0-a89a-4efa-b304-ee0ab7a8ce54-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9017beb0-a89a-4efa-b304-ee0ab7a8ce54\") " pod="openstack/ovn-northd-0" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.993854 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9017beb0-a89a-4efa-b304-ee0ab7a8ce54-scripts\") pod \"ovn-northd-0\" (UID: \"9017beb0-a89a-4efa-b304-ee0ab7a8ce54\") " pod="openstack/ovn-northd-0" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.993877 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9017beb0-a89a-4efa-b304-ee0ab7a8ce54-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9017beb0-a89a-4efa-b304-ee0ab7a8ce54\") " pod="openstack/ovn-northd-0" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.993926 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9017beb0-a89a-4efa-b304-ee0ab7a8ce54-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9017beb0-a89a-4efa-b304-ee0ab7a8ce54\") " pod="openstack/ovn-northd-0" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.993968 
4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8x6r\" (UniqueName: \"kubernetes.io/projected/9017beb0-a89a-4efa-b304-ee0ab7a8ce54-kube-api-access-l8x6r\") pod \"ovn-northd-0\" (UID: \"9017beb0-a89a-4efa-b304-ee0ab7a8ce54\") " pod="openstack/ovn-northd-0" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.994023 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9017beb0-a89a-4efa-b304-ee0ab7a8ce54-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9017beb0-a89a-4efa-b304-ee0ab7a8ce54\") " pod="openstack/ovn-northd-0" Oct 03 14:17:46 crc kubenswrapper[4636]: I1003 14:17:46.994331 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.095657 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6ae5571-6f09-4f64-b071-d669dc4d3f1f-dns-svc\") pod \"c6ae5571-6f09-4f64-b071-d669dc4d3f1f\" (UID: \"c6ae5571-6f09-4f64-b071-d669dc4d3f1f\") " Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.095725 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6ae5571-6f09-4f64-b071-d669dc4d3f1f-config\") pod \"c6ae5571-6f09-4f64-b071-d669dc4d3f1f\" (UID: \"c6ae5571-6f09-4f64-b071-d669dc4d3f1f\") " Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.095774 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxgs7\" (UniqueName: \"kubernetes.io/projected/c6ae5571-6f09-4f64-b071-d669dc4d3f1f-kube-api-access-gxgs7\") pod \"c6ae5571-6f09-4f64-b071-d669dc4d3f1f\" (UID: \"c6ae5571-6f09-4f64-b071-d669dc4d3f1f\") " Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.096021 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8x6r\" (UniqueName: \"kubernetes.io/projected/9017beb0-a89a-4efa-b304-ee0ab7a8ce54-kube-api-access-l8x6r\") pod \"ovn-northd-0\" (UID: \"9017beb0-a89a-4efa-b304-ee0ab7a8ce54\") " pod="openstack/ovn-northd-0" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.096141 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9017beb0-a89a-4efa-b304-ee0ab7a8ce54-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9017beb0-a89a-4efa-b304-ee0ab7a8ce54\") " pod="openstack/ovn-northd-0" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.096210 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9017beb0-a89a-4efa-b304-ee0ab7a8ce54-config\") pod \"ovn-northd-0\" (UID: \"9017beb0-a89a-4efa-b304-ee0ab7a8ce54\") " pod="openstack/ovn-northd-0" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.096255 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9017beb0-a89a-4efa-b304-ee0ab7a8ce54-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9017beb0-a89a-4efa-b304-ee0ab7a8ce54\") " pod="openstack/ovn-northd-0" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.096333 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/9017beb0-a89a-4efa-b304-ee0ab7a8ce54-scripts\") pod \"ovn-northd-0\" (UID: \"9017beb0-a89a-4efa-b304-ee0ab7a8ce54\") " pod="openstack/ovn-northd-0" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.096364 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9017beb0-a89a-4efa-b304-ee0ab7a8ce54-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9017beb0-a89a-4efa-b304-ee0ab7a8ce54\") " pod="openstack/ovn-northd-0" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.096384 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9017beb0-a89a-4efa-b304-ee0ab7a8ce54-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9017beb0-a89a-4efa-b304-ee0ab7a8ce54\") " pod="openstack/ovn-northd-0" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.097425 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9017beb0-a89a-4efa-b304-ee0ab7a8ce54-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9017beb0-a89a-4efa-b304-ee0ab7a8ce54\") " pod="openstack/ovn-northd-0" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.097910 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9017beb0-a89a-4efa-b304-ee0ab7a8ce54-config\") pod \"ovn-northd-0\" (UID: \"9017beb0-a89a-4efa-b304-ee0ab7a8ce54\") " pod="openstack/ovn-northd-0" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.098378 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9017beb0-a89a-4efa-b304-ee0ab7a8ce54-scripts\") pod \"ovn-northd-0\" (UID: \"9017beb0-a89a-4efa-b304-ee0ab7a8ce54\") " pod="openstack/ovn-northd-0" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.101579 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9017beb0-a89a-4efa-b304-ee0ab7a8ce54-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9017beb0-a89a-4efa-b304-ee0ab7a8ce54\") " pod="openstack/ovn-northd-0" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.104427 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9017beb0-a89a-4efa-b304-ee0ab7a8ce54-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9017beb0-a89a-4efa-b304-ee0ab7a8ce54\") " pod="openstack/ovn-northd-0" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.107436 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6ae5571-6f09-4f64-b071-d669dc4d3f1f-kube-api-access-gxgs7" (OuterVolumeSpecName: "kube-api-access-gxgs7") pod "c6ae5571-6f09-4f64-b071-d669dc4d3f1f" (UID: "c6ae5571-6f09-4f64-b071-d669dc4d3f1f"). InnerVolumeSpecName "kube-api-access-gxgs7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.113856 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9017beb0-a89a-4efa-b304-ee0ab7a8ce54-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9017beb0-a89a-4efa-b304-ee0ab7a8ce54\") " pod="openstack/ovn-northd-0" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.136652 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8x6r\" (UniqueName: \"kubernetes.io/projected/9017beb0-a89a-4efa-b304-ee0ab7a8ce54-kube-api-access-l8x6r\") pod \"ovn-northd-0\" (UID: \"9017beb0-a89a-4efa-b304-ee0ab7a8ce54\") " pod="openstack/ovn-northd-0" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.186404 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6ae5571-6f09-4f64-b071-d669dc4d3f1f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c6ae5571-6f09-4f64-b071-d669dc4d3f1f" (UID: "c6ae5571-6f09-4f64-b071-d669dc4d3f1f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.188151 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6ae5571-6f09-4f64-b071-d669dc4d3f1f-config" (OuterVolumeSpecName: "config") pod "c6ae5571-6f09-4f64-b071-d669dc4d3f1f" (UID: "c6ae5571-6f09-4f64-b071-d669dc4d3f1f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.197639 4636 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6ae5571-6f09-4f64-b071-d669dc4d3f1f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.197664 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6ae5571-6f09-4f64-b071-d669dc4d3f1f-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.197673 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxgs7\" (UniqueName: \"kubernetes.io/projected/c6ae5571-6f09-4f64-b071-d669dc4d3f1f-kube-api-access-gxgs7\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.267875 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.379462 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"781432ad-b393-4271-8a8a-39254e422cd4","Type":"ContainerStarted","Data":"0d7793a19ffd7ea0e77d2e7b888ded6ee3d5289d99f37cce33a7beb280b18909"} Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.391036 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4f5ff" event={"ID":"291d0189-08a0-4b8b-8406-8601de0e3708","Type":"ContainerStarted","Data":"4a78662fe6c3d51076f8f5bcb47f6cbb55277a3c8e7c14de52ca3b9d20c2a3c4"} Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.391077 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4f5ff" event={"ID":"291d0189-08a0-4b8b-8406-8601de0e3708","Type":"ContainerStarted","Data":"c9b7d420ba6433be89a62405d482c3ac3f257af69e1aa44543c9fe938f027ff7"} Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.415041 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371973.439754 podStartE2EDuration="1m3.415022021s" podCreationTimestamp="2025-10-03 14:16:44 +0000 UTC" firstStartedPulling="2025-10-03 14:16:46.90989468 +0000 UTC m=+956.768620927" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:17:47.411699385 +0000 UTC m=+1017.270425632" watchObservedRunningTime="2025-10-03 14:17:47.415022021 +0000 UTC m=+1017.273748268" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.419260 4636 generic.go:334] "Generic (PLEG): container finished" podID="c6ae5571-6f09-4f64-b071-d669dc4d3f1f" containerID="31ffbab704418cf4f3efd87189d19e5d9f8eabd576b99d4c50560183d947a549" exitCode=0 Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.419316 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-5dc2p" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.419326 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-5dc2p" event={"ID":"c6ae5571-6f09-4f64-b071-d669dc4d3f1f","Type":"ContainerDied","Data":"31ffbab704418cf4f3efd87189d19e5d9f8eabd576b99d4c50560183d947a549"} Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.433088 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-5dc2p" event={"ID":"c6ae5571-6f09-4f64-b071-d669dc4d3f1f","Type":"ContainerDied","Data":"358368f9145c41364b79b28c5e9db36c2f845a766ca129aecacf9bd37e2535e3"} Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.433154 4636 scope.go:117] "RemoveContainer" containerID="31ffbab704418cf4f3efd87189d19e5d9f8eabd576b99d4c50560183d947a549" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.439913 4636 generic.go:334] "Generic (PLEG): container finished" podID="d73f8680-f04e-4b8f-9a56-c0a4921e950a" containerID="22283cf851146ffe37855d0e1e7fd4f60bf0acb825bb87a5691a54e864985b54" exitCode=0 Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.448140 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" event={"ID":"d73f8680-f04e-4b8f-9a56-c0a4921e950a","Type":"ContainerDied","Data":"22283cf851146ffe37855d0e1e7fd4f60bf0acb825bb87a5691a54e864985b54"} Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.448197 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" event={"ID":"d73f8680-f04e-4b8f-9a56-c0a4921e950a","Type":"ContainerStarted","Data":"3ea84417305aad313dafe7c485d49426148c971769c80c6fbd99e6a9603ae116"} Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.503193 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-4f5ff" podStartSLOduration=2.503167553 podStartE2EDuration="2.503167553s" podCreationTimestamp="2025-10-03 14:17:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:17:47.444437169 +0000 UTC m=+1017.303163406" watchObservedRunningTime="2025-10-03 14:17:47.503167553 +0000 UTC m=+1017.361893800" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.523956 4636 scope.go:117] "RemoveContainer" containerID="cb7f050c8bb4ead1c1c0e3658d7b15df40cfd3bebfae0db7a9b5d72dfde820d1" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.605950 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5dc2p"] Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.620838 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5dc2p"] Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.627485 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-2mfj2" podUID="62646db9-d39c-4cb1-b308-22dff51e4bcf" containerName="ovn-controller" probeResult="failure" output=< Oct 03 14:17:47 crc kubenswrapper[4636]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 03 14:17:47 crc kubenswrapper[4636]: > Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.649521 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-68zzd"] Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.700704 4636 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.701856 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2pfz4" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.732367 4636 scope.go:117] "RemoveContainer" containerID="31ffbab704418cf4f3efd87189d19e5d9f8eabd576b99d4c50560183d947a549" Oct 03 14:17:47 crc kubenswrapper[4636]: E1003 14:17:47.732858 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31ffbab704418cf4f3efd87189d19e5d9f8eabd576b99d4c50560183d947a549\": container with ID starting with 31ffbab704418cf4f3efd87189d19e5d9f8eabd576b99d4c50560183d947a549 not found: ID does not exist" containerID="31ffbab704418cf4f3efd87189d19e5d9f8eabd576b99d4c50560183d947a549" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.732883 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31ffbab704418cf4f3efd87189d19e5d9f8eabd576b99d4c50560183d947a549"} err="failed to get container status \"31ffbab704418cf4f3efd87189d19e5d9f8eabd576b99d4c50560183d947a549\": rpc error: code = NotFound desc = could not find container \"31ffbab704418cf4f3efd87189d19e5d9f8eabd576b99d4c50560183d947a549\": container with ID starting with 31ffbab704418cf4f3efd87189d19e5d9f8eabd576b99d4c50560183d947a549 not found: ID does not exist" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.732902 4636 scope.go:117] "RemoveContainer" containerID="cb7f050c8bb4ead1c1c0e3658d7b15df40cfd3bebfae0db7a9b5d72dfde820d1" Oct 03 14:17:47 crc kubenswrapper[4636]: E1003 14:17:47.735141 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb7f050c8bb4ead1c1c0e3658d7b15df40cfd3bebfae0db7a9b5d72dfde820d1\": container with ID starting with cb7f050c8bb4ead1c1c0e3658d7b15df40cfd3bebfae0db7a9b5d72dfde820d1 not found: ID does not exist" containerID="cb7f050c8bb4ead1c1c0e3658d7b15df40cfd3bebfae0db7a9b5d72dfde820d1" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.735174 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb7f050c8bb4ead1c1c0e3658d7b15df40cfd3bebfae0db7a9b5d72dfde820d1"} err="failed to get container status \"cb7f050c8bb4ead1c1c0e3658d7b15df40cfd3bebfae0db7a9b5d72dfde820d1\": rpc error: code = NotFound desc = could not find container \"cb7f050c8bb4ead1c1c0e3658d7b15df40cfd3bebfae0db7a9b5d72dfde820d1\": container with ID starting with cb7f050c8bb4ead1c1c0e3658d7b15df40cfd3bebfae0db7a9b5d72dfde820d1 not found: ID does not exist" Oct 03 14:17:47 crc kubenswrapper[4636]: W1003 14:17:47.933441 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9017beb0_a89a_4efa_b304_ee0ab7a8ce54.slice/crio-c2a52e5eb8f70c361c844f2b584f8a98bc0ed83e0a514ee74149a1f0ab445fc0 WatchSource:0}: Error finding container c2a52e5eb8f70c361c844f2b584f8a98bc0ed83e0a514ee74149a1f0ab445fc0: Status 404 returned error can't find the container with id c2a52e5eb8f70c361c844f2b584f8a98bc0ed83e0a514ee74149a1f0ab445fc0 Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.937944 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.988895 4636 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ovn-controller-2mfj2-config-lmzq8"] Oct 03 14:17:47 crc kubenswrapper[4636]: E1003 14:17:47.989251 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ae5571-6f09-4f64-b071-d669dc4d3f1f" containerName="init" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.989269 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ae5571-6f09-4f64-b071-d669dc4d3f1f" containerName="init" Oct 03 14:17:47 crc kubenswrapper[4636]: E1003 14:17:47.989284 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ae5571-6f09-4f64-b071-d669dc4d3f1f" containerName="dnsmasq-dns" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.989290 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ae5571-6f09-4f64-b071-d669dc4d3f1f" containerName="dnsmasq-dns" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.989441 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6ae5571-6f09-4f64-b071-d669dc4d3f1f" containerName="dnsmasq-dns" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.989950 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2mfj2-config-lmzq8" Oct 03 14:17:47 crc kubenswrapper[4636]: I1003 14:17:47.993204 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.004812 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2mfj2-config-lmzq8"] Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.025750 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a171cfba-a302-4029-ab34-5b7a8f4146d4-var-run-ovn\") pod \"ovn-controller-2mfj2-config-lmzq8\" (UID: \"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " pod="openstack/ovn-controller-2mfj2-config-lmzq8" Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.025788 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a171cfba-a302-4029-ab34-5b7a8f4146d4-scripts\") pod \"ovn-controller-2mfj2-config-lmzq8\" (UID: \"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " pod="openstack/ovn-controller-2mfj2-config-lmzq8" Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.025975 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxwn7\" (UniqueName: \"kubernetes.io/projected/a171cfba-a302-4029-ab34-5b7a8f4146d4-kube-api-access-wxwn7\") pod \"ovn-controller-2mfj2-config-lmzq8\" (UID: \"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " pod="openstack/ovn-controller-2mfj2-config-lmzq8" Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.026192 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a171cfba-a302-4029-ab34-5b7a8f4146d4-var-run\") pod \"ovn-controller-2mfj2-config-lmzq8\" (UID: \"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " pod="openstack/ovn-controller-2mfj2-config-lmzq8" Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.026250 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a171cfba-a302-4029-ab34-5b7a8f4146d4-var-log-ovn\") pod \"ovn-controller-2mfj2-config-lmzq8\" (UID: 
\"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " pod="openstack/ovn-controller-2mfj2-config-lmzq8" Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.026279 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a171cfba-a302-4029-ab34-5b7a8f4146d4-additional-scripts\") pod \"ovn-controller-2mfj2-config-lmzq8\" (UID: \"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " pod="openstack/ovn-controller-2mfj2-config-lmzq8" Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.127364 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a171cfba-a302-4029-ab34-5b7a8f4146d4-var-run-ovn\") pod \"ovn-controller-2mfj2-config-lmzq8\" (UID: \"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " pod="openstack/ovn-controller-2mfj2-config-lmzq8" Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.127703 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a171cfba-a302-4029-ab34-5b7a8f4146d4-var-run-ovn\") pod \"ovn-controller-2mfj2-config-lmzq8\" (UID: \"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " pod="openstack/ovn-controller-2mfj2-config-lmzq8" Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.129811 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a171cfba-a302-4029-ab34-5b7a8f4146d4-scripts\") pod \"ovn-controller-2mfj2-config-lmzq8\" (UID: \"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " pod="openstack/ovn-controller-2mfj2-config-lmzq8" Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.127730 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a171cfba-a302-4029-ab34-5b7a8f4146d4-scripts\") pod \"ovn-controller-2mfj2-config-lmzq8\" (UID: \"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " pod="openstack/ovn-controller-2mfj2-config-lmzq8" Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.130125 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxwn7\" (UniqueName: \"kubernetes.io/projected/a171cfba-a302-4029-ab34-5b7a8f4146d4-kube-api-access-wxwn7\") pod \"ovn-controller-2mfj2-config-lmzq8\" (UID: \"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " pod="openstack/ovn-controller-2mfj2-config-lmzq8" Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.130710 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a171cfba-a302-4029-ab34-5b7a8f4146d4-var-run\") pod \"ovn-controller-2mfj2-config-lmzq8\" (UID: \"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " pod="openstack/ovn-controller-2mfj2-config-lmzq8" Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.130860 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a171cfba-a302-4029-ab34-5b7a8f4146d4-var-run\") pod \"ovn-controller-2mfj2-config-lmzq8\" (UID: \"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " pod="openstack/ovn-controller-2mfj2-config-lmzq8" Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.131076 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a171cfba-a302-4029-ab34-5b7a8f4146d4-var-log-ovn\") pod \"ovn-controller-2mfj2-config-lmzq8\" (UID: 
\"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " pod="openstack/ovn-controller-2mfj2-config-lmzq8" Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.131190 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a171cfba-a302-4029-ab34-5b7a8f4146d4-var-log-ovn\") pod \"ovn-controller-2mfj2-config-lmzq8\" (UID: \"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " pod="openstack/ovn-controller-2mfj2-config-lmzq8" Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.131302 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a171cfba-a302-4029-ab34-5b7a8f4146d4-additional-scripts\") pod \"ovn-controller-2mfj2-config-lmzq8\" (UID: \"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " pod="openstack/ovn-controller-2mfj2-config-lmzq8" Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.131908 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a171cfba-a302-4029-ab34-5b7a8f4146d4-additional-scripts\") pod \"ovn-controller-2mfj2-config-lmzq8\" (UID: \"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " pod="openstack/ovn-controller-2mfj2-config-lmzq8" Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.146890 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxwn7\" (UniqueName: \"kubernetes.io/projected/a171cfba-a302-4029-ab34-5b7a8f4146d4-kube-api-access-wxwn7\") pod \"ovn-controller-2mfj2-config-lmzq8\" (UID: \"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " pod="openstack/ovn-controller-2mfj2-config-lmzq8" Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.319809 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2mfj2-config-lmzq8" Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.477435 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" event={"ID":"d73f8680-f04e-4b8f-9a56-c0a4921e950a","Type":"ContainerStarted","Data":"9c20c8a8e33de1b1ab7b809bb5097eb1165a7cefb109e87081c795aac950ca98"} Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.477754 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.479193 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9017beb0-a89a-4efa-b304-ee0ab7a8ce54","Type":"ContainerStarted","Data":"c2a52e5eb8f70c361c844f2b584f8a98bc0ed83e0a514ee74149a1f0ab445fc0"} Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.480585 4636 generic.go:334] "Generic (PLEG): container finished" podID="3e923689-aa01-44f4-941e-56418b1c3fe5" containerID="c2f9e50356dae7c0b3c9bc71ae0827e7ad93c2b989d2c9ef131f81aced467fea" exitCode=0 Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.481246 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-68zzd" event={"ID":"3e923689-aa01-44f4-941e-56418b1c3fe5","Type":"ContainerDied","Data":"c2f9e50356dae7c0b3c9bc71ae0827e7ad93c2b989d2c9ef131f81aced467fea"} Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.481265 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-68zzd" event={"ID":"3e923689-aa01-44f4-941e-56418b1c3fe5","Type":"ContainerStarted","Data":"8f7d6643cd2557bbe5430c8ea00836d9746c6aa9a2cc16b2f7e320f628c86502"} Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.484164 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cb5889db5-qgprz" podUID="a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1" containerName="dnsmasq-dns" containerID="cri-o://de0ccd39fd44972fe1cecd21db24b60500aad7adbbabcda6840e86ea5363ace7" gracePeriod=10 Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.516052 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" podStartSLOduration=3.516026663 podStartE2EDuration="3.516026663s" podCreationTimestamp="2025-10-03 14:17:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:17:48.511618379 +0000 UTC m=+1018.370344646" watchObservedRunningTime="2025-10-03 14:17:48.516026663 +0000 UTC m=+1018.374752920" Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.617358 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2mfj2-config-lmzq8"] Oct 03 14:17:48 crc kubenswrapper[4636]: W1003 14:17:48.632611 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda171cfba_a302_4029_ab34_5b7a8f4146d4.slice/crio-6b11be083bfb69859b8c971b11eb2009956abbe9c8a1bc027195ab5b3a6cf626 WatchSource:0}: Error finding container 6b11be083bfb69859b8c971b11eb2009956abbe9c8a1bc027195ab5b3a6cf626: Status 404 returned error can't find the container with id 6b11be083bfb69859b8c971b11eb2009956abbe9c8a1bc027195ab5b3a6cf626 Oct 03 14:17:48 crc kubenswrapper[4636]: I1003 14:17:48.814295 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c6ae5571-6f09-4f64-b071-d669dc4d3f1f" path="/var/lib/kubelet/pods/c6ae5571-6f09-4f64-b071-d669dc4d3f1f/volumes" Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.118569 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-qgprz" Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.161573 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1-dns-svc\") pod \"a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1\" (UID: \"a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1\") " Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.161752 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1-config\") pod \"a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1\" (UID: \"a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1\") " Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.161799 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8th6\" (UniqueName: \"kubernetes.io/projected/a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1-kube-api-access-j8th6\") pod \"a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1\" (UID: \"a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1\") " Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.167735 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1-kube-api-access-j8th6" (OuterVolumeSpecName: "kube-api-access-j8th6") pod "a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1" (UID: "a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1"). InnerVolumeSpecName "kube-api-access-j8th6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.260416 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1" (UID: "a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.263811 4636 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.263838 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8th6\" (UniqueName: \"kubernetes.io/projected/a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1-kube-api-access-j8th6\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.265622 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1-config" (OuterVolumeSpecName: "config") pod "a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1" (UID: "a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.365739 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.492399 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-68zzd" event={"ID":"3e923689-aa01-44f4-941e-56418b1c3fe5","Type":"ContainerStarted","Data":"c28b53c7e699590d302089050ae3122674a36c7fbdb9e56e50be35b6965e2212"} Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.493497 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-68zzd" Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.497340 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mfj2-config-lmzq8" event={"ID":"a171cfba-a302-4029-ab34-5b7a8f4146d4","Type":"ContainerStarted","Data":"3b3b20ccfcd23fcda9b3081643d0e21ccef7acc175e1f050656c5361184fec2a"} Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.497381 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mfj2-config-lmzq8" event={"ID":"a171cfba-a302-4029-ab34-5b7a8f4146d4","Type":"ContainerStarted","Data":"6b11be083bfb69859b8c971b11eb2009956abbe9c8a1bc027195ab5b3a6cf626"} Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.501825 4636 generic.go:334] "Generic (PLEG): container finished" podID="a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1" containerID="de0ccd39fd44972fe1cecd21db24b60500aad7adbbabcda6840e86ea5363ace7" exitCode=0 Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.502543 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-qgprz" Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.504620 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-qgprz" event={"ID":"a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1","Type":"ContainerDied","Data":"de0ccd39fd44972fe1cecd21db24b60500aad7adbbabcda6840e86ea5363ace7"} Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.504690 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-qgprz" event={"ID":"a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1","Type":"ContainerDied","Data":"fed58d542bdfe6697d0eeaa11d1b9210bad75d51d04f6adeb920ee794377aa6e"} Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.504710 4636 scope.go:117] "RemoveContainer" containerID="de0ccd39fd44972fe1cecd21db24b60500aad7adbbabcda6840e86ea5363ace7" Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.517019 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-68zzd" podStartSLOduration=3.516997666 podStartE2EDuration="3.516997666s" podCreationTimestamp="2025-10-03 14:17:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:17:49.510303793 +0000 UTC m=+1019.369030040" watchObservedRunningTime="2025-10-03 14:17:49.516997666 +0000 UTC m=+1019.375723903" Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.541091 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-2mfj2-config-lmzq8" podStartSLOduration=2.541072806 podStartE2EDuration="2.541072806s" podCreationTimestamp="2025-10-03 14:17:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:17:49.539768183 +0000 UTC m=+1019.398494430" watchObservedRunningTime="2025-10-03 14:17:49.541072806 +0000 UTC m=+1019.399799053" Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.559686 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-qgprz"] Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.569579 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-qgprz"] Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.736579 4636 scope.go:117] "RemoveContainer" containerID="203009d098dbb36ec081a4b70c42a479366d8f1f94c1e6dcf5f466a7b6a93aa3" Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.755013 4636 scope.go:117] "RemoveContainer" containerID="de0ccd39fd44972fe1cecd21db24b60500aad7adbbabcda6840e86ea5363ace7" Oct 03 14:17:49 crc kubenswrapper[4636]: E1003 14:17:49.755400 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de0ccd39fd44972fe1cecd21db24b60500aad7adbbabcda6840e86ea5363ace7\": container with ID starting with de0ccd39fd44972fe1cecd21db24b60500aad7adbbabcda6840e86ea5363ace7 not found: ID does not exist" containerID="de0ccd39fd44972fe1cecd21db24b60500aad7adbbabcda6840e86ea5363ace7" Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.755429 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de0ccd39fd44972fe1cecd21db24b60500aad7adbbabcda6840e86ea5363ace7"} err="failed to get container status \"de0ccd39fd44972fe1cecd21db24b60500aad7adbbabcda6840e86ea5363ace7\": rpc error: code = NotFound desc = could not find 
container \"de0ccd39fd44972fe1cecd21db24b60500aad7adbbabcda6840e86ea5363ace7\": container with ID starting with de0ccd39fd44972fe1cecd21db24b60500aad7adbbabcda6840e86ea5363ace7 not found: ID does not exist" Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.755462 4636 scope.go:117] "RemoveContainer" containerID="203009d098dbb36ec081a4b70c42a479366d8f1f94c1e6dcf5f466a7b6a93aa3" Oct 03 14:17:49 crc kubenswrapper[4636]: E1003 14:17:49.755712 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"203009d098dbb36ec081a4b70c42a479366d8f1f94c1e6dcf5f466a7b6a93aa3\": container with ID starting with 203009d098dbb36ec081a4b70c42a479366d8f1f94c1e6dcf5f466a7b6a93aa3 not found: ID does not exist" containerID="203009d098dbb36ec081a4b70c42a479366d8f1f94c1e6dcf5f466a7b6a93aa3" Oct 03 14:17:49 crc kubenswrapper[4636]: I1003 14:17:49.755774 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"203009d098dbb36ec081a4b70c42a479366d8f1f94c1e6dcf5f466a7b6a93aa3"} err="failed to get container status \"203009d098dbb36ec081a4b70c42a479366d8f1f94c1e6dcf5f466a7b6a93aa3\": rpc error: code = NotFound desc = could not find container \"203009d098dbb36ec081a4b70c42a479366d8f1f94c1e6dcf5f466a7b6a93aa3\": container with ID starting with 203009d098dbb36ec081a4b70c42a479366d8f1f94c1e6dcf5f466a7b6a93aa3 not found: ID does not exist" Oct 03 14:17:50 crc kubenswrapper[4636]: I1003 14:17:50.515523 4636 generic.go:334] "Generic (PLEG): container finished" podID="a171cfba-a302-4029-ab34-5b7a8f4146d4" containerID="3b3b20ccfcd23fcda9b3081643d0e21ccef7acc175e1f050656c5361184fec2a" exitCode=0 Oct 03 14:17:50 crc kubenswrapper[4636]: I1003 14:17:50.515602 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mfj2-config-lmzq8" event={"ID":"a171cfba-a302-4029-ab34-5b7a8f4146d4","Type":"ContainerDied","Data":"3b3b20ccfcd23fcda9b3081643d0e21ccef7acc175e1f050656c5361184fec2a"} Oct 03 14:17:50 crc kubenswrapper[4636]: I1003 14:17:50.518757 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9017beb0-a89a-4efa-b304-ee0ab7a8ce54","Type":"ContainerStarted","Data":"f99782364c54ebee52f14750b0b8aeafb855548ba1b6914f8caafc914d6967f1"} Oct 03 14:17:50 crc kubenswrapper[4636]: I1003 14:17:50.518808 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9017beb0-a89a-4efa-b304-ee0ab7a8ce54","Type":"ContainerStarted","Data":"bb5bac5c405da8650825a888de75ba2b898b96add25f9a99d62d2f9024ae5eda"} Oct 03 14:17:50 crc kubenswrapper[4636]: I1003 14:17:50.518957 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 03 14:17:50 crc kubenswrapper[4636]: I1003 14:17:50.568279 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.722086406 podStartE2EDuration="4.568258777s" podCreationTimestamp="2025-10-03 14:17:46 +0000 UTC" firstStartedPulling="2025-10-03 14:17:47.935257642 +0000 UTC m=+1017.793983889" lastFinishedPulling="2025-10-03 14:17:49.781430013 +0000 UTC m=+1019.640156260" observedRunningTime="2025-10-03 14:17:50.565833854 +0000 UTC m=+1020.424560101" watchObservedRunningTime="2025-10-03 14:17:50.568258777 +0000 UTC m=+1020.426985024" Oct 03 14:17:50 crc kubenswrapper[4636]: I1003 14:17:50.805645 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1" path="/var/lib/kubelet/pods/a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1/volumes" Oct 03 14:17:51 crc kubenswrapper[4636]: I1003 14:17:51.878319 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2mfj2-config-lmzq8" Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.018286 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a171cfba-a302-4029-ab34-5b7a8f4146d4-scripts\") pod \"a171cfba-a302-4029-ab34-5b7a8f4146d4\" (UID: \"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.018591 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a171cfba-a302-4029-ab34-5b7a8f4146d4-var-log-ovn\") pod \"a171cfba-a302-4029-ab34-5b7a8f4146d4\" (UID: \"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.018713 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a171cfba-a302-4029-ab34-5b7a8f4146d4-var-run-ovn\") pod \"a171cfba-a302-4029-ab34-5b7a8f4146d4\" (UID: \"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.018771 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a171cfba-a302-4029-ab34-5b7a8f4146d4-additional-scripts\") pod \"a171cfba-a302-4029-ab34-5b7a8f4146d4\" (UID: \"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.018814 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a171cfba-a302-4029-ab34-5b7a8f4146d4-var-run\") pod \"a171cfba-a302-4029-ab34-5b7a8f4146d4\" (UID: \"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.018893 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxwn7\" (UniqueName: \"kubernetes.io/projected/a171cfba-a302-4029-ab34-5b7a8f4146d4-kube-api-access-wxwn7\") pod \"a171cfba-a302-4029-ab34-5b7a8f4146d4\" (UID: \"a171cfba-a302-4029-ab34-5b7a8f4146d4\") " Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.019006 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a171cfba-a302-4029-ab34-5b7a8f4146d4-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a171cfba-a302-4029-ab34-5b7a8f4146d4" (UID: "a171cfba-a302-4029-ab34-5b7a8f4146d4"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.019029 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a171cfba-a302-4029-ab34-5b7a8f4146d4-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a171cfba-a302-4029-ab34-5b7a8f4146d4" (UID: "a171cfba-a302-4029-ab34-5b7a8f4146d4"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.019067 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a171cfba-a302-4029-ab34-5b7a8f4146d4-var-run" (OuterVolumeSpecName: "var-run") pod "a171cfba-a302-4029-ab34-5b7a8f4146d4" (UID: "a171cfba-a302-4029-ab34-5b7a8f4146d4"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.019362 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a171cfba-a302-4029-ab34-5b7a8f4146d4-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a171cfba-a302-4029-ab34-5b7a8f4146d4" (UID: "a171cfba-a302-4029-ab34-5b7a8f4146d4"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.019516 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a171cfba-a302-4029-ab34-5b7a8f4146d4-scripts" (OuterVolumeSpecName: "scripts") pod "a171cfba-a302-4029-ab34-5b7a8f4146d4" (UID: "a171cfba-a302-4029-ab34-5b7a8f4146d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.019966 4636 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a171cfba-a302-4029-ab34-5b7a8f4146d4-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.019985 4636 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a171cfba-a302-4029-ab34-5b7a8f4146d4-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.019994 4636 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a171cfba-a302-4029-ab34-5b7a8f4146d4-var-run\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.020003 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a171cfba-a302-4029-ab34-5b7a8f4146d4-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.020012 4636 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a171cfba-a302-4029-ab34-5b7a8f4146d4-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.027220 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a171cfba-a302-4029-ab34-5b7a8f4146d4-kube-api-access-wxwn7" (OuterVolumeSpecName: "kube-api-access-wxwn7") pod "a171cfba-a302-4029-ab34-5b7a8f4146d4" (UID: "a171cfba-a302-4029-ab34-5b7a8f4146d4"). InnerVolumeSpecName "kube-api-access-wxwn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.120998 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift\") pod \"swift-storage-0\" (UID: \"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") " pod="openstack/swift-storage-0" Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.121151 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxwn7\" (UniqueName: \"kubernetes.io/projected/a171cfba-a302-4029-ab34-5b7a8f4146d4-kube-api-access-wxwn7\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:52 crc kubenswrapper[4636]: E1003 14:17:52.121253 4636 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 14:17:52 crc kubenswrapper[4636]: E1003 14:17:52.121267 4636 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 14:17:52 crc kubenswrapper[4636]: E1003 14:17:52.121307 4636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift podName:201b506e-9cc5-4ab0-9af4-96a357d19f6e nodeName:}" failed. No retries permitted until 2025-10-03 14:18:24.121293642 +0000 UTC m=+1053.980019889 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift") pod "swift-storage-0" (UID: "201b506e-9cc5-4ab0-9af4-96a357d19f6e") : configmap "swift-ring-files" not found Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.560732 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-2mfj2" Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.573750 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2mfj2-config-lmzq8" event={"ID":"a171cfba-a302-4029-ab34-5b7a8f4146d4","Type":"ContainerDied","Data":"6b11be083bfb69859b8c971b11eb2009956abbe9c8a1bc027195ab5b3a6cf626"} Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.573800 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b11be083bfb69859b8c971b11eb2009956abbe9c8a1bc027195ab5b3a6cf626" Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.573813 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2mfj2-config-lmzq8" Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.984242 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-2mfj2-config-lmzq8"] Oct 03 14:17:52 crc kubenswrapper[4636]: I1003 14:17:52.991373 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-2mfj2-config-lmzq8"] Oct 03 14:17:53 crc kubenswrapper[4636]: I1003 14:17:53.776379 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 03 14:17:54 crc kubenswrapper[4636]: I1003 14:17:54.194306 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:17:54 crc kubenswrapper[4636]: I1003 14:17:54.802125 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a171cfba-a302-4029-ab34-5b7a8f4146d4" path="/var/lib/kubelet/pods/a171cfba-a302-4029-ab34-5b7a8f4146d4/volumes" Oct 03 14:17:55 crc kubenswrapper[4636]: I1003 14:17:55.593061 4636 generic.go:334] "Generic (PLEG): container finished" podID="00eeeec0-4e4a-4e2c-aaa6-07a793372fd7" containerID="e8fff478098aff86770ab075a906118add1e2a5e10185f72c8bfd3fd501935af" exitCode=0 Oct 03 14:17:55 crc kubenswrapper[4636]: I1003 14:17:55.593397 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qz8ds" event={"ID":"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7","Type":"ContainerDied","Data":"e8fff478098aff86770ab075a906118add1e2a5e10185f72c8bfd3fd501935af"} Oct 03 14:17:55 crc kubenswrapper[4636]: I1003 14:17:55.953077 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" Oct 03 14:17:56 crc kubenswrapper[4636]: I1003 14:17:56.275704 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 03 14:17:56 crc kubenswrapper[4636]: I1003 14:17:56.276060 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 03 14:17:56 crc kubenswrapper[4636]: I1003 14:17:56.321938 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 03 14:17:56 crc kubenswrapper[4636]: I1003 14:17:56.658953 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 03 14:17:56 crc kubenswrapper[4636]: I1003 14:17:56.809319 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-68zzd" Oct 03 14:17:56 crc kubenswrapper[4636]: I1003 14:17:56.912358 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-z2z7s"] Oct 03 14:17:56 crc kubenswrapper[4636]: I1003 14:17:56.917579 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" podUID="d73f8680-f04e-4b8f-9a56-c0a4921e950a" containerName="dnsmasq-dns" containerID="cri-o://9c20c8a8e33de1b1ab7b809bb5097eb1165a7cefb109e87081c795aac950ca98" gracePeriod=10 Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.026376 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.103312 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj2tb\" (UniqueName: \"kubernetes.io/projected/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-kube-api-access-zj2tb\") pod \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.103390 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-ring-data-devices\") pod \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.103488 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-combined-ca-bundle\") pod \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.103557 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-swiftconf\") pod \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.103614 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-scripts\") pod \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.103729 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-etc-swift\") pod \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.103762 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-dispersionconf\") pod \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\" (UID: \"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7\") " Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.104788 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "00eeeec0-4e4a-4e2c-aaa6-07a793372fd7" (UID: "00eeeec0-4e4a-4e2c-aaa6-07a793372fd7"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.105623 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "00eeeec0-4e4a-4e2c-aaa6-07a793372fd7" (UID: "00eeeec0-4e4a-4e2c-aaa6-07a793372fd7"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.125863 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-kube-api-access-zj2tb" (OuterVolumeSpecName: "kube-api-access-zj2tb") pod "00eeeec0-4e4a-4e2c-aaa6-07a793372fd7" (UID: "00eeeec0-4e4a-4e2c-aaa6-07a793372fd7"). InnerVolumeSpecName "kube-api-access-zj2tb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.143846 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "00eeeec0-4e4a-4e2c-aaa6-07a793372fd7" (UID: "00eeeec0-4e4a-4e2c-aaa6-07a793372fd7"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.175077 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-scripts" (OuterVolumeSpecName: "scripts") pod "00eeeec0-4e4a-4e2c-aaa6-07a793372fd7" (UID: "00eeeec0-4e4a-4e2c-aaa6-07a793372fd7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.184627 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "00eeeec0-4e4a-4e2c-aaa6-07a793372fd7" (UID: "00eeeec0-4e4a-4e2c-aaa6-07a793372fd7"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.186784 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00eeeec0-4e4a-4e2c-aaa6-07a793372fd7" (UID: "00eeeec0-4e4a-4e2c-aaa6-07a793372fd7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.206129 4636 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.206159 4636 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.206171 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zj2tb\" (UniqueName: \"kubernetes.io/projected/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-kube-api-access-zj2tb\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.206205 4636 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.206216 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.206225 4636 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.206236 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00eeeec0-4e4a-4e2c-aaa6-07a793372fd7-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.227278 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-4pz7d"] Oct 03 14:17:57 crc kubenswrapper[4636]: E1003 14:17:57.227735 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00eeeec0-4e4a-4e2c-aaa6-07a793372fd7" containerName="swift-ring-rebalance" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.227752 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="00eeeec0-4e4a-4e2c-aaa6-07a793372fd7" containerName="swift-ring-rebalance" Oct 03 14:17:57 crc kubenswrapper[4636]: E1003 14:17:57.227771 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1" containerName="init" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.227779 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1" containerName="init" Oct 03 14:17:57 crc kubenswrapper[4636]: E1003 14:17:57.227809 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a171cfba-a302-4029-ab34-5b7a8f4146d4" containerName="ovn-config" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.227818 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="a171cfba-a302-4029-ab34-5b7a8f4146d4" containerName="ovn-config" Oct 03 14:17:57 crc kubenswrapper[4636]: E1003 14:17:57.227837 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1" containerName="dnsmasq-dns" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.227845 4636 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1" containerName="dnsmasq-dns" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.228050 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="a171cfba-a302-4029-ab34-5b7a8f4146d4" containerName="ovn-config" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.228074 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9fd0173-c3c4-42e1-9793-1cdc65f3dbe1" containerName="dnsmasq-dns" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.228089 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="00eeeec0-4e4a-4e2c-aaa6-07a793372fd7" containerName="swift-ring-rebalance" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.228738 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4pz7d" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.233247 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-4pz7d"] Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.307759 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw9pj\" (UniqueName: \"kubernetes.io/projected/316899b4-4f1f-4065-ae6b-fddfa3c90ab6-kube-api-access-kw9pj\") pod \"placement-db-create-4pz7d\" (UID: \"316899b4-4f1f-4065-ae6b-fddfa3c90ab6\") " pod="openstack/placement-db-create-4pz7d" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.371985 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.409263 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d73f8680-f04e-4b8f-9a56-c0a4921e950a-ovsdbserver-nb\") pod \"d73f8680-f04e-4b8f-9a56-c0a4921e950a\" (UID: \"d73f8680-f04e-4b8f-9a56-c0a4921e950a\") " Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.409340 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d73f8680-f04e-4b8f-9a56-c0a4921e950a-dns-svc\") pod \"d73f8680-f04e-4b8f-9a56-c0a4921e950a\" (UID: \"d73f8680-f04e-4b8f-9a56-c0a4921e950a\") " Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.409468 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73f8680-f04e-4b8f-9a56-c0a4921e950a-config\") pod \"d73f8680-f04e-4b8f-9a56-c0a4921e950a\" (UID: \"d73f8680-f04e-4b8f-9a56-c0a4921e950a\") " Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.409489 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjrgq\" (UniqueName: \"kubernetes.io/projected/d73f8680-f04e-4b8f-9a56-c0a4921e950a-kube-api-access-vjrgq\") pod \"d73f8680-f04e-4b8f-9a56-c0a4921e950a\" (UID: \"d73f8680-f04e-4b8f-9a56-c0a4921e950a\") " Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.409703 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw9pj\" (UniqueName: \"kubernetes.io/projected/316899b4-4f1f-4065-ae6b-fddfa3c90ab6-kube-api-access-kw9pj\") pod \"placement-db-create-4pz7d\" (UID: \"316899b4-4f1f-4065-ae6b-fddfa3c90ab6\") " pod="openstack/placement-db-create-4pz7d" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.418745 4636 operation_generator.go:803] UnmountVolume.TearDown 
Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.437117 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw9pj\" (UniqueName: \"kubernetes.io/projected/316899b4-4f1f-4065-ae6b-fddfa3c90ab6-kube-api-access-kw9pj\") pod \"placement-db-create-4pz7d\" (UID: \"316899b4-4f1f-4065-ae6b-fddfa3c90ab6\") " pod="openstack/placement-db-create-4pz7d" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.459867 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73f8680-f04e-4b8f-9a56-c0a4921e950a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d73f8680-f04e-4b8f-9a56-c0a4921e950a" (UID: "d73f8680-f04e-4b8f-9a56-c0a4921e950a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.461400 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73f8680-f04e-4b8f-9a56-c0a4921e950a-config" (OuterVolumeSpecName: "config") pod "d73f8680-f04e-4b8f-9a56-c0a4921e950a" (UID: "d73f8680-f04e-4b8f-9a56-c0a4921e950a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.462234 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73f8680-f04e-4b8f-9a56-c0a4921e950a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d73f8680-f04e-4b8f-9a56-c0a4921e950a" (UID: "d73f8680-f04e-4b8f-9a56-c0a4921e950a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.511676 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73f8680-f04e-4b8f-9a56-c0a4921e950a-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.511710 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjrgq\" (UniqueName: \"kubernetes.io/projected/d73f8680-f04e-4b8f-9a56-c0a4921e950a-kube-api-access-vjrgq\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.511722 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d73f8680-f04e-4b8f-9a56-c0a4921e950a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.511731 4636 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d73f8680-f04e-4b8f-9a56-c0a4921e950a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.548427 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-4pz7d" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.618354 4636 generic.go:334] "Generic (PLEG): container finished" podID="d73f8680-f04e-4b8f-9a56-c0a4921e950a" containerID="9c20c8a8e33de1b1ab7b809bb5097eb1165a7cefb109e87081c795aac950ca98" exitCode=0 Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.618457 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.619028 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" event={"ID":"d73f8680-f04e-4b8f-9a56-c0a4921e950a","Type":"ContainerDied","Data":"9c20c8a8e33de1b1ab7b809bb5097eb1165a7cefb109e87081c795aac950ca98"} Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.619107 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-z2z7s" event={"ID":"d73f8680-f04e-4b8f-9a56-c0a4921e950a","Type":"ContainerDied","Data":"3ea84417305aad313dafe7c485d49426148c971769c80c6fbd99e6a9603ae116"} Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.619132 4636 scope.go:117] "RemoveContainer" containerID="9c20c8a8e33de1b1ab7b809bb5097eb1165a7cefb109e87081c795aac950ca98" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.622724 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qz8ds" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.622749 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qz8ds" event={"ID":"00eeeec0-4e4a-4e2c-aaa6-07a793372fd7","Type":"ContainerDied","Data":"1f5f2f2a43f22f1751970aecb30772b6aec59d9ac2f962da4b38c292a5a07ea1"} Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.622787 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f5f2f2a43f22f1751970aecb30772b6aec59d9ac2f962da4b38c292a5a07ea1" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.641438 4636 scope.go:117] "RemoveContainer" containerID="22283cf851146ffe37855d0e1e7fd4f60bf0acb825bb87a5691a54e864985b54" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.661663 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-z2z7s"] Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.670576 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-z2z7s"] Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.709575 4636 scope.go:117] "RemoveContainer" containerID="9c20c8a8e33de1b1ab7b809bb5097eb1165a7cefb109e87081c795aac950ca98" Oct 03 14:17:57 crc kubenswrapper[4636]: E1003 14:17:57.710032 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c20c8a8e33de1b1ab7b809bb5097eb1165a7cefb109e87081c795aac950ca98\": container with ID starting with 9c20c8a8e33de1b1ab7b809bb5097eb1165a7cefb109e87081c795aac950ca98 not found: ID does not exist" containerID="9c20c8a8e33de1b1ab7b809bb5097eb1165a7cefb109e87081c795aac950ca98" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.710065 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c20c8a8e33de1b1ab7b809bb5097eb1165a7cefb109e87081c795aac950ca98"} err="failed to get container status \"9c20c8a8e33de1b1ab7b809bb5097eb1165a7cefb109e87081c795aac950ca98\": rpc error: code = NotFound desc = 
could not find container \"9c20c8a8e33de1b1ab7b809bb5097eb1165a7cefb109e87081c795aac950ca98\": container with ID starting with 9c20c8a8e33de1b1ab7b809bb5097eb1165a7cefb109e87081c795aac950ca98 not found: ID does not exist" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.710373 4636 scope.go:117] "RemoveContainer" containerID="22283cf851146ffe37855d0e1e7fd4f60bf0acb825bb87a5691a54e864985b54" Oct 03 14:17:57 crc kubenswrapper[4636]: E1003 14:17:57.710576 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22283cf851146ffe37855d0e1e7fd4f60bf0acb825bb87a5691a54e864985b54\": container with ID starting with 22283cf851146ffe37855d0e1e7fd4f60bf0acb825bb87a5691a54e864985b54 not found: ID does not exist" containerID="22283cf851146ffe37855d0e1e7fd4f60bf0acb825bb87a5691a54e864985b54" Oct 03 14:17:57 crc kubenswrapper[4636]: I1003 14:17:57.710598 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22283cf851146ffe37855d0e1e7fd4f60bf0acb825bb87a5691a54e864985b54"} err="failed to get container status \"22283cf851146ffe37855d0e1e7fd4f60bf0acb825bb87a5691a54e864985b54\": rpc error: code = NotFound desc = could not find container \"22283cf851146ffe37855d0e1e7fd4f60bf0acb825bb87a5691a54e864985b54\": container with ID starting with 22283cf851146ffe37855d0e1e7fd4f60bf0acb825bb87a5691a54e864985b54 not found: ID does not exist" Oct 03 14:17:58 crc kubenswrapper[4636]: I1003 14:17:58.008586 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-4pz7d"] Oct 03 14:17:58 crc kubenswrapper[4636]: W1003 14:17:58.014703 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod316899b4_4f1f_4065_ae6b_fddfa3c90ab6.slice/crio-2885b27fe54a7d1adb234d35cb636d4ef03d2271fe8f6eba75e412bbe6631ffc WatchSource:0}: Error finding container 2885b27fe54a7d1adb234d35cb636d4ef03d2271fe8f6eba75e412bbe6631ffc: Status 404 returned error can't find the container with id 2885b27fe54a7d1adb234d35cb636d4ef03d2271fe8f6eba75e412bbe6631ffc Oct 03 14:17:58 crc kubenswrapper[4636]: I1003 14:17:58.633255 4636 generic.go:334] "Generic (PLEG): container finished" podID="316899b4-4f1f-4065-ae6b-fddfa3c90ab6" containerID="7427676545f7fbcc148fe269987c5a6a638bc3eff511865c0ddc8424c989e573" exitCode=0 Oct 03 14:17:58 crc kubenswrapper[4636]: I1003 14:17:58.633299 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4pz7d" event={"ID":"316899b4-4f1f-4065-ae6b-fddfa3c90ab6","Type":"ContainerDied","Data":"7427676545f7fbcc148fe269987c5a6a638bc3eff511865c0ddc8424c989e573"} Oct 03 14:17:58 crc kubenswrapper[4636]: I1003 14:17:58.633333 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4pz7d" event={"ID":"316899b4-4f1f-4065-ae6b-fddfa3c90ab6","Type":"ContainerStarted","Data":"2885b27fe54a7d1adb234d35cb636d4ef03d2271fe8f6eba75e412bbe6631ffc"} Oct 03 14:17:58 crc kubenswrapper[4636]: I1003 14:17:58.801967 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d73f8680-f04e-4b8f-9a56-c0a4921e950a" path="/var/lib/kubelet/pods/d73f8680-f04e-4b8f-9a56-c0a4921e950a/volumes" Oct 03 14:17:59 crc kubenswrapper[4636]: I1003 14:17:59.915460 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-4pz7d" Oct 03 14:17:59 crc kubenswrapper[4636]: I1003 14:17:59.954846 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw9pj\" (UniqueName: \"kubernetes.io/projected/316899b4-4f1f-4065-ae6b-fddfa3c90ab6-kube-api-access-kw9pj\") pod \"316899b4-4f1f-4065-ae6b-fddfa3c90ab6\" (UID: \"316899b4-4f1f-4065-ae6b-fddfa3c90ab6\") " Oct 03 14:17:59 crc kubenswrapper[4636]: I1003 14:17:59.963340 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/316899b4-4f1f-4065-ae6b-fddfa3c90ab6-kube-api-access-kw9pj" (OuterVolumeSpecName: "kube-api-access-kw9pj") pod "316899b4-4f1f-4065-ae6b-fddfa3c90ab6" (UID: "316899b4-4f1f-4065-ae6b-fddfa3c90ab6"). InnerVolumeSpecName "kube-api-access-kw9pj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:18:00 crc kubenswrapper[4636]: I1003 14:18:00.058015 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw9pj\" (UniqueName: \"kubernetes.io/projected/316899b4-4f1f-4065-ae6b-fddfa3c90ab6-kube-api-access-kw9pj\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:00 crc kubenswrapper[4636]: I1003 14:18:00.650133 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4pz7d" event={"ID":"316899b4-4f1f-4065-ae6b-fddfa3c90ab6","Type":"ContainerDied","Data":"2885b27fe54a7d1adb234d35cb636d4ef03d2271fe8f6eba75e412bbe6631ffc"} Oct 03 14:18:00 crc kubenswrapper[4636]: I1003 14:18:00.650178 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2885b27fe54a7d1adb234d35cb636d4ef03d2271fe8f6eba75e412bbe6631ffc" Oct 03 14:18:00 crc kubenswrapper[4636]: I1003 14:18:00.650178 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-4pz7d" Oct 03 14:18:02 crc kubenswrapper[4636]: I1003 14:18:02.323186 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 03 14:18:02 crc kubenswrapper[4636]: I1003 14:18:02.409775 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-8zhsb"] Oct 03 14:18:02 crc kubenswrapper[4636]: E1003 14:18:02.410605 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d73f8680-f04e-4b8f-9a56-c0a4921e950a" containerName="dnsmasq-dns" Oct 03 14:18:02 crc kubenswrapper[4636]: I1003 14:18:02.410631 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="d73f8680-f04e-4b8f-9a56-c0a4921e950a" containerName="dnsmasq-dns" Oct 03 14:18:02 crc kubenswrapper[4636]: E1003 14:18:02.410666 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d73f8680-f04e-4b8f-9a56-c0a4921e950a" containerName="init" Oct 03 14:18:02 crc kubenswrapper[4636]: I1003 14:18:02.410676 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="d73f8680-f04e-4b8f-9a56-c0a4921e950a" containerName="init" Oct 03 14:18:02 crc kubenswrapper[4636]: E1003 14:18:02.410698 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="316899b4-4f1f-4065-ae6b-fddfa3c90ab6" containerName="mariadb-database-create" Oct 03 14:18:02 crc kubenswrapper[4636]: I1003 14:18:02.410706 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="316899b4-4f1f-4065-ae6b-fddfa3c90ab6" containerName="mariadb-database-create" Oct 03 14:18:02 crc kubenswrapper[4636]: I1003 14:18:02.410890 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="d73f8680-f04e-4b8f-9a56-c0a4921e950a" containerName="dnsmasq-dns" Oct 03 14:18:02 crc kubenswrapper[4636]: I1003 14:18:02.410919 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="316899b4-4f1f-4065-ae6b-fddfa3c90ab6" containerName="mariadb-database-create" Oct 03 14:18:02 crc kubenswrapper[4636]: I1003 14:18:02.411614 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8zhsb" Oct 03 14:18:02 crc kubenswrapper[4636]: I1003 14:18:02.428522 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8zhsb"] Oct 03 14:18:02 crc kubenswrapper[4636]: I1003 14:18:02.495995 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4djm\" (UniqueName: \"kubernetes.io/projected/4b04e574-4d75-478a-a55f-486aab465fa7-kube-api-access-d4djm\") pod \"glance-db-create-8zhsb\" (UID: \"4b04e574-4d75-478a-a55f-486aab465fa7\") " pod="openstack/glance-db-create-8zhsb" Oct 03 14:18:02 crc kubenswrapper[4636]: I1003 14:18:02.597546 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4djm\" (UniqueName: \"kubernetes.io/projected/4b04e574-4d75-478a-a55f-486aab465fa7-kube-api-access-d4djm\") pod \"glance-db-create-8zhsb\" (UID: \"4b04e574-4d75-478a-a55f-486aab465fa7\") " pod="openstack/glance-db-create-8zhsb" Oct 03 14:18:02 crc kubenswrapper[4636]: I1003 14:18:02.618333 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4djm\" (UniqueName: \"kubernetes.io/projected/4b04e574-4d75-478a-a55f-486aab465fa7-kube-api-access-d4djm\") pod \"glance-db-create-8zhsb\" (UID: \"4b04e574-4d75-478a-a55f-486aab465fa7\") " pod="openstack/glance-db-create-8zhsb" Oct 03 14:18:02 crc kubenswrapper[4636]: I1003 14:18:02.741187 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8zhsb" Oct 03 14:18:03 crc kubenswrapper[4636]: I1003 14:18:03.192672 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8zhsb"] Oct 03 14:18:03 crc kubenswrapper[4636]: E1003 14:18:03.515572 4636 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b04e574_4d75_478a_a55f_486aab465fa7.slice/crio-4fc67d8ab8602e125956c304c66ab466454a22bbcb767f4de347fe516cd3e915.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b04e574_4d75_478a_a55f_486aab465fa7.slice/crio-conmon-4fc67d8ab8602e125956c304c66ab466454a22bbcb767f4de347fe516cd3e915.scope\": RecentStats: unable to find data in memory cache]" Oct 03 14:18:03 crc kubenswrapper[4636]: I1003 14:18:03.673370 4636 generic.go:334] "Generic (PLEG): container finished" podID="4b04e574-4d75-478a-a55f-486aab465fa7" containerID="4fc67d8ab8602e125956c304c66ab466454a22bbcb767f4de347fe516cd3e915" exitCode=0 Oct 03 14:18:03 crc kubenswrapper[4636]: I1003 14:18:03.673468 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8zhsb" event={"ID":"4b04e574-4d75-478a-a55f-486aab465fa7","Type":"ContainerDied","Data":"4fc67d8ab8602e125956c304c66ab466454a22bbcb767f4de347fe516cd3e915"} Oct 03 14:18:03 crc kubenswrapper[4636]: I1003 14:18:03.673721 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8zhsb" event={"ID":"4b04e574-4d75-478a-a55f-486aab465fa7","Type":"ContainerStarted","Data":"9656ddf77f406ef127643a7275f7b7743cef0232e92938af5fa16b33db60fad1"} Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.085527 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-zh4x6"] Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.090925 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-zh4x6" Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.093410 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-zh4x6"] Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.119502 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jvp5\" (UniqueName: \"kubernetes.io/projected/cba05788-5cbc-43bf-90a3-16dd333267d6-kube-api-access-2jvp5\") pod \"barbican-db-create-zh4x6\" (UID: \"cba05788-5cbc-43bf-90a3-16dd333267d6\") " pod="openstack/barbican-db-create-zh4x6" Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.186679 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-pbmsg"] Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.187769 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-pbmsg" Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.192522 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-pbmsg"] Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.221074 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jvp5\" (UniqueName: \"kubernetes.io/projected/cba05788-5cbc-43bf-90a3-16dd333267d6-kube-api-access-2jvp5\") pod \"barbican-db-create-zh4x6\" (UID: \"cba05788-5cbc-43bf-90a3-16dd333267d6\") " pod="openstack/barbican-db-create-zh4x6" Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.237723 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jvp5\" (UniqueName: \"kubernetes.io/projected/cba05788-5cbc-43bf-90a3-16dd333267d6-kube-api-access-2jvp5\") pod \"barbican-db-create-zh4x6\" (UID: \"cba05788-5cbc-43bf-90a3-16dd333267d6\") " pod="openstack/barbican-db-create-zh4x6" Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.291165 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-jjmrv"] Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.292303 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jjmrv" Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.299171 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-jjmrv"] Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.331536 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfdkk\" (UniqueName: \"kubernetes.io/projected/39407704-a90e-4ea4-a39b-1ec109994c04-kube-api-access-pfdkk\") pod \"neutron-db-create-jjmrv\" (UID: \"39407704-a90e-4ea4-a39b-1ec109994c04\") " pod="openstack/neutron-db-create-jjmrv" Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.331635 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksb7p\" (UniqueName: \"kubernetes.io/projected/15d1351d-7b6e-4ced-b207-5ec41477a9a6-kube-api-access-ksb7p\") pod \"cinder-db-create-pbmsg\" (UID: \"15d1351d-7b6e-4ced-b207-5ec41477a9a6\") " pod="openstack/cinder-db-create-pbmsg" Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.411308 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-zh4x6" Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.437437 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfdkk\" (UniqueName: \"kubernetes.io/projected/39407704-a90e-4ea4-a39b-1ec109994c04-kube-api-access-pfdkk\") pod \"neutron-db-create-jjmrv\" (UID: \"39407704-a90e-4ea4-a39b-1ec109994c04\") " pod="openstack/neutron-db-create-jjmrv" Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.437500 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksb7p\" (UniqueName: \"kubernetes.io/projected/15d1351d-7b6e-4ced-b207-5ec41477a9a6-kube-api-access-ksb7p\") pod \"cinder-db-create-pbmsg\" (UID: \"15d1351d-7b6e-4ced-b207-5ec41477a9a6\") " pod="openstack/cinder-db-create-pbmsg" Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.455693 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfdkk\" (UniqueName: \"kubernetes.io/projected/39407704-a90e-4ea4-a39b-1ec109994c04-kube-api-access-pfdkk\") pod \"neutron-db-create-jjmrv\" (UID: \"39407704-a90e-4ea4-a39b-1ec109994c04\") " pod="openstack/neutron-db-create-jjmrv" Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.456421 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksb7p\" (UniqueName: \"kubernetes.io/projected/15d1351d-7b6e-4ced-b207-5ec41477a9a6-kube-api-access-ksb7p\") pod \"cinder-db-create-pbmsg\" (UID: \"15d1351d-7b6e-4ced-b207-5ec41477a9a6\") " pod="openstack/cinder-db-create-pbmsg" Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.501580 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-pbmsg" Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.633811 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jjmrv" Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.676575 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-zh4x6"] Oct 03 14:18:04 crc kubenswrapper[4636]: W1003 14:18:04.695955 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcba05788_5cbc_43bf_90a3_16dd333267d6.slice/crio-8b5fd2a4f9b28285db31e3fc58622d7112a6f04af7df566d96f44672de4f2efc WatchSource:0}: Error finding container 8b5fd2a4f9b28285db31e3fc58622d7112a6f04af7df566d96f44672de4f2efc: Status 404 returned error can't find the container with id 8b5fd2a4f9b28285db31e3fc58622d7112a6f04af7df566d96f44672de4f2efc Oct 03 14:18:04 crc kubenswrapper[4636]: I1003 14:18:04.994614 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8zhsb" Oct 03 14:18:05 crc kubenswrapper[4636]: I1003 14:18:04.997899 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-pbmsg"] Oct 03 14:18:05 crc kubenswrapper[4636]: W1003 14:18:04.998753 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15d1351d_7b6e_4ced_b207_5ec41477a9a6.slice/crio-779abb9f079ccf743bde36968386f98a24bd59c31cd8f136e40638e54b17f373 WatchSource:0}: Error finding container 779abb9f079ccf743bde36968386f98a24bd59c31cd8f136e40638e54b17f373: Status 404 returned error can't find the container with id 779abb9f079ccf743bde36968386f98a24bd59c31cd8f136e40638e54b17f373 Oct 03 14:18:05 crc kubenswrapper[4636]: I1003 14:18:05.156811 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4djm\" (UniqueName: \"kubernetes.io/projected/4b04e574-4d75-478a-a55f-486aab465fa7-kube-api-access-d4djm\") pod \"4b04e574-4d75-478a-a55f-486aab465fa7\" (UID: \"4b04e574-4d75-478a-a55f-486aab465fa7\") " Oct 03 14:18:05 crc kubenswrapper[4636]: I1003 14:18:05.167817 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-jjmrv"] Oct 03 14:18:05 crc kubenswrapper[4636]: I1003 14:18:05.169707 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b04e574-4d75-478a-a55f-486aab465fa7-kube-api-access-d4djm" (OuterVolumeSpecName: "kube-api-access-d4djm") pod "4b04e574-4d75-478a-a55f-486aab465fa7" (UID: "4b04e574-4d75-478a-a55f-486aab465fa7"). InnerVolumeSpecName "kube-api-access-d4djm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:18:05 crc kubenswrapper[4636]: W1003 14:18:05.174812 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39407704_a90e_4ea4_a39b_1ec109994c04.slice/crio-8af33cc74c2d3131db2e24c9a2fcb52f7ba58b5d1df3c86fc9b6efffe70c8070 WatchSource:0}: Error finding container 8af33cc74c2d3131db2e24c9a2fcb52f7ba58b5d1df3c86fc9b6efffe70c8070: Status 404 returned error can't find the container with id 8af33cc74c2d3131db2e24c9a2fcb52f7ba58b5d1df3c86fc9b6efffe70c8070 Oct 03 14:18:05 crc kubenswrapper[4636]: I1003 14:18:05.259382 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4djm\" (UniqueName: \"kubernetes.io/projected/4b04e574-4d75-478a-a55f-486aab465fa7-kube-api-access-d4djm\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:05 crc kubenswrapper[4636]: I1003 14:18:05.696484 4636 generic.go:334] "Generic (PLEG): container finished" podID="15d1351d-7b6e-4ced-b207-5ec41477a9a6" containerID="095eee19630799318fdc6318ba4bb43e071d907e171920f99aa03e47055d3d9f" exitCode=0 Oct 03 14:18:05 crc kubenswrapper[4636]: I1003 14:18:05.696565 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-pbmsg" event={"ID":"15d1351d-7b6e-4ced-b207-5ec41477a9a6","Type":"ContainerDied","Data":"095eee19630799318fdc6318ba4bb43e071d907e171920f99aa03e47055d3d9f"} Oct 03 14:18:05 crc kubenswrapper[4636]: I1003 14:18:05.696711 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-pbmsg" event={"ID":"15d1351d-7b6e-4ced-b207-5ec41477a9a6","Type":"ContainerStarted","Data":"779abb9f079ccf743bde36968386f98a24bd59c31cd8f136e40638e54b17f373"} Oct 03 14:18:05 crc kubenswrapper[4636]: I1003 14:18:05.698356 4636 generic.go:334] "Generic (PLEG): 
container finished" podID="cba05788-5cbc-43bf-90a3-16dd333267d6" containerID="1445179e9326a4af4ecda546a09280bb29c9007e099827700aa55b99a765e574" exitCode=0 Oct 03 14:18:05 crc kubenswrapper[4636]: I1003 14:18:05.698434 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zh4x6" event={"ID":"cba05788-5cbc-43bf-90a3-16dd333267d6","Type":"ContainerDied","Data":"1445179e9326a4af4ecda546a09280bb29c9007e099827700aa55b99a765e574"} Oct 03 14:18:05 crc kubenswrapper[4636]: I1003 14:18:05.698464 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zh4x6" event={"ID":"cba05788-5cbc-43bf-90a3-16dd333267d6","Type":"ContainerStarted","Data":"8b5fd2a4f9b28285db31e3fc58622d7112a6f04af7df566d96f44672de4f2efc"} Oct 03 14:18:05 crc kubenswrapper[4636]: I1003 14:18:05.700734 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8zhsb" event={"ID":"4b04e574-4d75-478a-a55f-486aab465fa7","Type":"ContainerDied","Data":"9656ddf77f406ef127643a7275f7b7743cef0232e92938af5fa16b33db60fad1"} Oct 03 14:18:05 crc kubenswrapper[4636]: I1003 14:18:05.700774 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9656ddf77f406ef127643a7275f7b7743cef0232e92938af5fa16b33db60fad1" Oct 03 14:18:05 crc kubenswrapper[4636]: I1003 14:18:05.700843 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8zhsb" Oct 03 14:18:05 crc kubenswrapper[4636]: I1003 14:18:05.703019 4636 generic.go:334] "Generic (PLEG): container finished" podID="39407704-a90e-4ea4-a39b-1ec109994c04" containerID="474118c6ef44d0a9437fd488c1de35bb8a64721c2e4f794114bd97ead892e0bb" exitCode=0 Oct 03 14:18:05 crc kubenswrapper[4636]: I1003 14:18:05.703072 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jjmrv" event={"ID":"39407704-a90e-4ea4-a39b-1ec109994c04","Type":"ContainerDied","Data":"474118c6ef44d0a9437fd488c1de35bb8a64721c2e4f794114bd97ead892e0bb"} Oct 03 14:18:05 crc kubenswrapper[4636]: I1003 14:18:05.703116 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jjmrv" event={"ID":"39407704-a90e-4ea4-a39b-1ec109994c04","Type":"ContainerStarted","Data":"8af33cc74c2d3131db2e24c9a2fcb52f7ba58b5d1df3c86fc9b6efffe70c8070"} Oct 03 14:18:06 crc kubenswrapper[4636]: I1003 14:18:06.757190 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-9bncb"] Oct 03 14:18:06 crc kubenswrapper[4636]: E1003 14:18:06.757811 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b04e574-4d75-478a-a55f-486aab465fa7" containerName="mariadb-database-create" Oct 03 14:18:06 crc kubenswrapper[4636]: I1003 14:18:06.757829 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b04e574-4d75-478a-a55f-486aab465fa7" containerName="mariadb-database-create" Oct 03 14:18:06 crc kubenswrapper[4636]: I1003 14:18:06.758037 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b04e574-4d75-478a-a55f-486aab465fa7" containerName="mariadb-database-create" Oct 03 14:18:06 crc kubenswrapper[4636]: I1003 14:18:06.758632 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-9bncb" Oct 03 14:18:06 crc kubenswrapper[4636]: I1003 14:18:06.783501 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntmbw\" (UniqueName: \"kubernetes.io/projected/8f968a95-a0b1-4f56-886b-64674656f645-kube-api-access-ntmbw\") pod \"keystone-db-create-9bncb\" (UID: \"8f968a95-a0b1-4f56-886b-64674656f645\") " pod="openstack/keystone-db-create-9bncb" Oct 03 14:18:06 crc kubenswrapper[4636]: I1003 14:18:06.814356 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-9bncb"] Oct 03 14:18:06 crc kubenswrapper[4636]: I1003 14:18:06.884512 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntmbw\" (UniqueName: \"kubernetes.io/projected/8f968a95-a0b1-4f56-886b-64674656f645-kube-api-access-ntmbw\") pod \"keystone-db-create-9bncb\" (UID: \"8f968a95-a0b1-4f56-886b-64674656f645\") " pod="openstack/keystone-db-create-9bncb" Oct 03 14:18:06 crc kubenswrapper[4636]: I1003 14:18:06.915980 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntmbw\" (UniqueName: \"kubernetes.io/projected/8f968a95-a0b1-4f56-886b-64674656f645-kube-api-access-ntmbw\") pod \"keystone-db-create-9bncb\" (UID: \"8f968a95-a0b1-4f56-886b-64674656f645\") " pod="openstack/keystone-db-create-9bncb" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.074610 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-9bncb" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.086285 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-pbmsg" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.088383 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksb7p\" (UniqueName: \"kubernetes.io/projected/15d1351d-7b6e-4ced-b207-5ec41477a9a6-kube-api-access-ksb7p\") pod \"15d1351d-7b6e-4ced-b207-5ec41477a9a6\" (UID: \"15d1351d-7b6e-4ced-b207-5ec41477a9a6\") " Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.105408 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15d1351d-7b6e-4ced-b207-5ec41477a9a6-kube-api-access-ksb7p" (OuterVolumeSpecName: "kube-api-access-ksb7p") pod "15d1351d-7b6e-4ced-b207-5ec41477a9a6" (UID: "15d1351d-7b6e-4ced-b207-5ec41477a9a6"). InnerVolumeSpecName "kube-api-access-ksb7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.189718 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksb7p\" (UniqueName: \"kubernetes.io/projected/15d1351d-7b6e-4ced-b207-5ec41477a9a6-kube-api-access-ksb7p\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.191056 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-zh4x6" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.196528 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-jjmrv" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.232294 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8d56-account-create-m645b"] Oct 03 14:18:07 crc kubenswrapper[4636]: E1003 14:18:07.232657 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39407704-a90e-4ea4-a39b-1ec109994c04" containerName="mariadb-database-create" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.232672 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="39407704-a90e-4ea4-a39b-1ec109994c04" containerName="mariadb-database-create" Oct 03 14:18:07 crc kubenswrapper[4636]: E1003 14:18:07.232702 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba05788-5cbc-43bf-90a3-16dd333267d6" containerName="mariadb-database-create" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.232710 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba05788-5cbc-43bf-90a3-16dd333267d6" containerName="mariadb-database-create" Oct 03 14:18:07 crc kubenswrapper[4636]: E1003 14:18:07.232719 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15d1351d-7b6e-4ced-b207-5ec41477a9a6" containerName="mariadb-database-create" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.232724 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d1351d-7b6e-4ced-b207-5ec41477a9a6" containerName="mariadb-database-create" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.232865 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="39407704-a90e-4ea4-a39b-1ec109994c04" containerName="mariadb-database-create" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.232880 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="15d1351d-7b6e-4ced-b207-5ec41477a9a6" containerName="mariadb-database-create" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.232890 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba05788-5cbc-43bf-90a3-16dd333267d6" containerName="mariadb-database-create" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.233734 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8d56-account-create-m645b" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.235882 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.261185 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8d56-account-create-m645b"] Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.290914 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jvp5\" (UniqueName: \"kubernetes.io/projected/cba05788-5cbc-43bf-90a3-16dd333267d6-kube-api-access-2jvp5\") pod \"cba05788-5cbc-43bf-90a3-16dd333267d6\" (UID: \"cba05788-5cbc-43bf-90a3-16dd333267d6\") " Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.291027 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfdkk\" (UniqueName: \"kubernetes.io/projected/39407704-a90e-4ea4-a39b-1ec109994c04-kube-api-access-pfdkk\") pod \"39407704-a90e-4ea4-a39b-1ec109994c04\" (UID: \"39407704-a90e-4ea4-a39b-1ec109994c04\") " Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.291402 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhd8j\" (UniqueName: \"kubernetes.io/projected/8dfd65d1-e6ef-4646-a546-6a03d0443231-kube-api-access-jhd8j\") pod \"placement-8d56-account-create-m645b\" (UID: \"8dfd65d1-e6ef-4646-a546-6a03d0443231\") " pod="openstack/placement-8d56-account-create-m645b" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.298811 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cba05788-5cbc-43bf-90a3-16dd333267d6-kube-api-access-2jvp5" (OuterVolumeSpecName: "kube-api-access-2jvp5") pod "cba05788-5cbc-43bf-90a3-16dd333267d6" (UID: "cba05788-5cbc-43bf-90a3-16dd333267d6"). InnerVolumeSpecName "kube-api-access-2jvp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.300483 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39407704-a90e-4ea4-a39b-1ec109994c04-kube-api-access-pfdkk" (OuterVolumeSpecName: "kube-api-access-pfdkk") pod "39407704-a90e-4ea4-a39b-1ec109994c04" (UID: "39407704-a90e-4ea4-a39b-1ec109994c04"). InnerVolumeSpecName "kube-api-access-pfdkk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.392665 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhd8j\" (UniqueName: \"kubernetes.io/projected/8dfd65d1-e6ef-4646-a546-6a03d0443231-kube-api-access-jhd8j\") pod \"placement-8d56-account-create-m645b\" (UID: \"8dfd65d1-e6ef-4646-a546-6a03d0443231\") " pod="openstack/placement-8d56-account-create-m645b" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.393130 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jvp5\" (UniqueName: \"kubernetes.io/projected/cba05788-5cbc-43bf-90a3-16dd333267d6-kube-api-access-2jvp5\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.393152 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfdkk\" (UniqueName: \"kubernetes.io/projected/39407704-a90e-4ea4-a39b-1ec109994c04-kube-api-access-pfdkk\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.408969 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhd8j\" (UniqueName: \"kubernetes.io/projected/8dfd65d1-e6ef-4646-a546-6a03d0443231-kube-api-access-jhd8j\") pod \"placement-8d56-account-create-m645b\" (UID: \"8dfd65d1-e6ef-4646-a546-6a03d0443231\") " pod="openstack/placement-8d56-account-create-m645b" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.551085 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-9bncb"] Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.561419 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8d56-account-create-m645b" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.733976 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9bncb" event={"ID":"8f968a95-a0b1-4f56-886b-64674656f645","Type":"ContainerStarted","Data":"b794f9330ed0b7b3e42e013f4a8b6940c8c4e6405986c9288dd177feee1d69ed"} Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.737297 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jjmrv" event={"ID":"39407704-a90e-4ea4-a39b-1ec109994c04","Type":"ContainerDied","Data":"8af33cc74c2d3131db2e24c9a2fcb52f7ba58b5d1df3c86fc9b6efffe70c8070"} Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.737329 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8af33cc74c2d3131db2e24c9a2fcb52f7ba58b5d1df3c86fc9b6efffe70c8070" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.737439 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jjmrv" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.743712 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-pbmsg" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.743714 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-pbmsg" event={"ID":"15d1351d-7b6e-4ced-b207-5ec41477a9a6","Type":"ContainerDied","Data":"779abb9f079ccf743bde36968386f98a24bd59c31cd8f136e40638e54b17f373"} Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.743854 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="779abb9f079ccf743bde36968386f98a24bd59c31cd8f136e40638e54b17f373" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.771626 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zh4x6" event={"ID":"cba05788-5cbc-43bf-90a3-16dd333267d6","Type":"ContainerDied","Data":"8b5fd2a4f9b28285db31e3fc58622d7112a6f04af7df566d96f44672de4f2efc"} Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.771660 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b5fd2a4f9b28285db31e3fc58622d7112a6f04af7df566d96f44672de4f2efc" Oct 03 14:18:07 crc kubenswrapper[4636]: I1003 14:18:07.771710 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-zh4x6" Oct 03 14:18:08 crc kubenswrapper[4636]: I1003 14:18:08.046590 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8d56-account-create-m645b"] Oct 03 14:18:08 crc kubenswrapper[4636]: W1003 14:18:08.052178 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dfd65d1_e6ef_4646_a546_6a03d0443231.slice/crio-eee7a7b7c73f248bea4ec6e1644b92a8e4fca435362dae12027248b051a3be5a WatchSource:0}: Error finding container eee7a7b7c73f248bea4ec6e1644b92a8e4fca435362dae12027248b051a3be5a: Status 404 returned error can't find the container with id eee7a7b7c73f248bea4ec6e1644b92a8e4fca435362dae12027248b051a3be5a Oct 03 14:18:08 crc kubenswrapper[4636]: I1003 14:18:08.779032 4636 generic.go:334] "Generic (PLEG): container finished" podID="8dfd65d1-e6ef-4646-a546-6a03d0443231" containerID="27a71dd2504c82195b4fdb68f9a984a42f8a81f8be983e256d3c34ec08fccce1" exitCode=0 Oct 03 14:18:08 crc kubenswrapper[4636]: I1003 14:18:08.779089 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d56-account-create-m645b" event={"ID":"8dfd65d1-e6ef-4646-a546-6a03d0443231","Type":"ContainerDied","Data":"27a71dd2504c82195b4fdb68f9a984a42f8a81f8be983e256d3c34ec08fccce1"} Oct 03 14:18:08 crc kubenswrapper[4636]: I1003 14:18:08.779136 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d56-account-create-m645b" event={"ID":"8dfd65d1-e6ef-4646-a546-6a03d0443231","Type":"ContainerStarted","Data":"eee7a7b7c73f248bea4ec6e1644b92a8e4fca435362dae12027248b051a3be5a"} Oct 03 14:18:08 crc kubenswrapper[4636]: I1003 14:18:08.780456 4636 generic.go:334] "Generic (PLEG): container finished" podID="8f968a95-a0b1-4f56-886b-64674656f645" containerID="b6b0bd47b8175ddb56dec171e0ed222e997d0e28fdae220abee7b845e3fe99e6" exitCode=0 Oct 03 14:18:08 crc kubenswrapper[4636]: I1003 14:18:08.780485 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9bncb" event={"ID":"8f968a95-a0b1-4f56-886b-64674656f645","Type":"ContainerDied","Data":"b6b0bd47b8175ddb56dec171e0ed222e997d0e28fdae220abee7b845e3fe99e6"} Oct 03 14:18:10 crc kubenswrapper[4636]: I1003 14:18:10.179819 4636 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8d56-account-create-m645b" Oct 03 14:18:10 crc kubenswrapper[4636]: I1003 14:18:10.186825 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-9bncb" Oct 03 14:18:10 crc kubenswrapper[4636]: I1003 14:18:10.245542 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhd8j\" (UniqueName: \"kubernetes.io/projected/8dfd65d1-e6ef-4646-a546-6a03d0443231-kube-api-access-jhd8j\") pod \"8dfd65d1-e6ef-4646-a546-6a03d0443231\" (UID: \"8dfd65d1-e6ef-4646-a546-6a03d0443231\") " Oct 03 14:18:10 crc kubenswrapper[4636]: I1003 14:18:10.245756 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntmbw\" (UniqueName: \"kubernetes.io/projected/8f968a95-a0b1-4f56-886b-64674656f645-kube-api-access-ntmbw\") pod \"8f968a95-a0b1-4f56-886b-64674656f645\" (UID: \"8f968a95-a0b1-4f56-886b-64674656f645\") " Oct 03 14:18:10 crc kubenswrapper[4636]: I1003 14:18:10.252364 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dfd65d1-e6ef-4646-a546-6a03d0443231-kube-api-access-jhd8j" (OuterVolumeSpecName: "kube-api-access-jhd8j") pod "8dfd65d1-e6ef-4646-a546-6a03d0443231" (UID: "8dfd65d1-e6ef-4646-a546-6a03d0443231"). InnerVolumeSpecName "kube-api-access-jhd8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:18:10 crc kubenswrapper[4636]: I1003 14:18:10.252404 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f968a95-a0b1-4f56-886b-64674656f645-kube-api-access-ntmbw" (OuterVolumeSpecName: "kube-api-access-ntmbw") pod "8f968a95-a0b1-4f56-886b-64674656f645" (UID: "8f968a95-a0b1-4f56-886b-64674656f645"). InnerVolumeSpecName "kube-api-access-ntmbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:18:10 crc kubenswrapper[4636]: I1003 14:18:10.347806 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhd8j\" (UniqueName: \"kubernetes.io/projected/8dfd65d1-e6ef-4646-a546-6a03d0443231-kube-api-access-jhd8j\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:10 crc kubenswrapper[4636]: I1003 14:18:10.347836 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntmbw\" (UniqueName: \"kubernetes.io/projected/8f968a95-a0b1-4f56-886b-64674656f645-kube-api-access-ntmbw\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:10 crc kubenswrapper[4636]: I1003 14:18:10.799899 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8d56-account-create-m645b" Oct 03 14:18:10 crc kubenswrapper[4636]: I1003 14:18:10.799953 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-9bncb" Oct 03 14:18:10 crc kubenswrapper[4636]: I1003 14:18:10.802709 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d56-account-create-m645b" event={"ID":"8dfd65d1-e6ef-4646-a546-6a03d0443231","Type":"ContainerDied","Data":"eee7a7b7c73f248bea4ec6e1644b92a8e4fca435362dae12027248b051a3be5a"} Oct 03 14:18:10 crc kubenswrapper[4636]: I1003 14:18:10.802742 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eee7a7b7c73f248bea4ec6e1644b92a8e4fca435362dae12027248b051a3be5a" Oct 03 14:18:10 crc kubenswrapper[4636]: I1003 14:18:10.802754 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9bncb" event={"ID":"8f968a95-a0b1-4f56-886b-64674656f645","Type":"ContainerDied","Data":"b794f9330ed0b7b3e42e013f4a8b6940c8c4e6405986c9288dd177feee1d69ed"} Oct 03 14:18:10 crc kubenswrapper[4636]: I1003 14:18:10.802763 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b794f9330ed0b7b3e42e013f4a8b6940c8c4e6405986c9288dd177feee1d69ed" Oct 03 14:18:12 crc kubenswrapper[4636]: I1003 14:18:12.548654 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-6e1e-account-create-5f99d"] Oct 03 14:18:12 crc kubenswrapper[4636]: E1003 14:18:12.548998 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dfd65d1-e6ef-4646-a546-6a03d0443231" containerName="mariadb-account-create" Oct 03 14:18:12 crc kubenswrapper[4636]: I1003 14:18:12.549288 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dfd65d1-e6ef-4646-a546-6a03d0443231" containerName="mariadb-account-create" Oct 03 14:18:12 crc kubenswrapper[4636]: E1003 14:18:12.549336 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f968a95-a0b1-4f56-886b-64674656f645" containerName="mariadb-database-create" Oct 03 14:18:12 crc kubenswrapper[4636]: I1003 14:18:12.549346 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f968a95-a0b1-4f56-886b-64674656f645" containerName="mariadb-database-create" Oct 03 14:18:12 crc kubenswrapper[4636]: I1003 14:18:12.549514 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f968a95-a0b1-4f56-886b-64674656f645" containerName="mariadb-database-create" Oct 03 14:18:12 crc kubenswrapper[4636]: I1003 14:18:12.549526 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dfd65d1-e6ef-4646-a546-6a03d0443231" containerName="mariadb-account-create" Oct 03 14:18:12 crc kubenswrapper[4636]: I1003 14:18:12.555415 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6e1e-account-create-5f99d" Oct 03 14:18:12 crc kubenswrapper[4636]: I1003 14:18:12.557746 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 03 14:18:12 crc kubenswrapper[4636]: I1003 14:18:12.567417 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6e1e-account-create-5f99d"] Oct 03 14:18:12 crc kubenswrapper[4636]: I1003 14:18:12.591092 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-954p4\" (UniqueName: \"kubernetes.io/projected/659ed666-3a55-49bb-a35e-a59098f195d0-kube-api-access-954p4\") pod \"glance-6e1e-account-create-5f99d\" (UID: \"659ed666-3a55-49bb-a35e-a59098f195d0\") " pod="openstack/glance-6e1e-account-create-5f99d" Oct 03 14:18:12 crc kubenswrapper[4636]: I1003 14:18:12.692262 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-954p4\" (UniqueName: \"kubernetes.io/projected/659ed666-3a55-49bb-a35e-a59098f195d0-kube-api-access-954p4\") pod \"glance-6e1e-account-create-5f99d\" (UID: \"659ed666-3a55-49bb-a35e-a59098f195d0\") " pod="openstack/glance-6e1e-account-create-5f99d" Oct 03 14:18:12 crc kubenswrapper[4636]: I1003 14:18:12.707886 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-954p4\" (UniqueName: \"kubernetes.io/projected/659ed666-3a55-49bb-a35e-a59098f195d0-kube-api-access-954p4\") pod \"glance-6e1e-account-create-5f99d\" (UID: \"659ed666-3a55-49bb-a35e-a59098f195d0\") " pod="openstack/glance-6e1e-account-create-5f99d" Oct 03 14:18:12 crc kubenswrapper[4636]: I1003 14:18:12.885694 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6e1e-account-create-5f99d" Oct 03 14:18:13 crc kubenswrapper[4636]: I1003 14:18:13.326929 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6e1e-account-create-5f99d"] Oct 03 14:18:13 crc kubenswrapper[4636]: I1003 14:18:13.818460 4636 generic.go:334] "Generic (PLEG): container finished" podID="659ed666-3a55-49bb-a35e-a59098f195d0" containerID="1f0f864b1513ec5d23a369230ba27c7d5d37fec35beb05061cc0c47c8b0ddb89" exitCode=0 Oct 03 14:18:13 crc kubenswrapper[4636]: I1003 14:18:13.818542 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6e1e-account-create-5f99d" event={"ID":"659ed666-3a55-49bb-a35e-a59098f195d0","Type":"ContainerDied","Data":"1f0f864b1513ec5d23a369230ba27c7d5d37fec35beb05061cc0c47c8b0ddb89"} Oct 03 14:18:13 crc kubenswrapper[4636]: I1003 14:18:13.818799 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6e1e-account-create-5f99d" event={"ID":"659ed666-3a55-49bb-a35e-a59098f195d0","Type":"ContainerStarted","Data":"34880d04e2a5a1584516b754e98af2100a659dc21fecb4ade2c09fcd9b6f0110"} Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.267796 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-448f-account-create-prkdj"] Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.268802 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-448f-account-create-prkdj" Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.271687 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.293188 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-448f-account-create-prkdj"] Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.320804 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ds64\" (UniqueName: \"kubernetes.io/projected/888c174c-e532-4731-a87a-7490a32c8e8b-kube-api-access-2ds64\") pod \"cinder-448f-account-create-prkdj\" (UID: \"888c174c-e532-4731-a87a-7490a32c8e8b\") " pod="openstack/cinder-448f-account-create-prkdj" Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.359798 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-dc03-account-create-rwrxn"] Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.360948 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-dc03-account-create-rwrxn" Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.365012 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.378427 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-dc03-account-create-rwrxn"] Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.422925 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ds64\" (UniqueName: \"kubernetes.io/projected/888c174c-e532-4731-a87a-7490a32c8e8b-kube-api-access-2ds64\") pod \"cinder-448f-account-create-prkdj\" (UID: \"888c174c-e532-4731-a87a-7490a32c8e8b\") " pod="openstack/cinder-448f-account-create-prkdj" Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.423081 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l268v\" (UniqueName: \"kubernetes.io/projected/4dd02d0a-c1bc-4e2d-a682-7b2db952669a-kube-api-access-l268v\") pod \"barbican-dc03-account-create-rwrxn\" (UID: \"4dd02d0a-c1bc-4e2d-a682-7b2db952669a\") " pod="openstack/barbican-dc03-account-create-rwrxn" Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.451568 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ds64\" (UniqueName: \"kubernetes.io/projected/888c174c-e532-4731-a87a-7490a32c8e8b-kube-api-access-2ds64\") pod \"cinder-448f-account-create-prkdj\" (UID: \"888c174c-e532-4731-a87a-7490a32c8e8b\") " pod="openstack/cinder-448f-account-create-prkdj" Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.524614 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l268v\" (UniqueName: \"kubernetes.io/projected/4dd02d0a-c1bc-4e2d-a682-7b2db952669a-kube-api-access-l268v\") pod \"barbican-dc03-account-create-rwrxn\" (UID: \"4dd02d0a-c1bc-4e2d-a682-7b2db952669a\") " pod="openstack/barbican-dc03-account-create-rwrxn" Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.541002 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l268v\" (UniqueName: \"kubernetes.io/projected/4dd02d0a-c1bc-4e2d-a682-7b2db952669a-kube-api-access-l268v\") pod \"barbican-dc03-account-create-rwrxn\" (UID: \"4dd02d0a-c1bc-4e2d-a682-7b2db952669a\") " 
pod="openstack/barbican-dc03-account-create-rwrxn" Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.567176 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c50b-account-create-qn2bs"] Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.568116 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c50b-account-create-qn2bs" Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.569969 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.579849 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c50b-account-create-qn2bs"] Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.599681 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-448f-account-create-prkdj" Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.626210 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl9kg\" (UniqueName: \"kubernetes.io/projected/6fe7893f-d888-4ac1-8179-4aa5322618f1-kube-api-access-vl9kg\") pod \"neutron-c50b-account-create-qn2bs\" (UID: \"6fe7893f-d888-4ac1-8179-4aa5322618f1\") " pod="openstack/neutron-c50b-account-create-qn2bs" Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.677109 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-dc03-account-create-rwrxn" Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.727600 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl9kg\" (UniqueName: \"kubernetes.io/projected/6fe7893f-d888-4ac1-8179-4aa5322618f1-kube-api-access-vl9kg\") pod \"neutron-c50b-account-create-qn2bs\" (UID: \"6fe7893f-d888-4ac1-8179-4aa5322618f1\") " pod="openstack/neutron-c50b-account-create-qn2bs" Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.745707 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl9kg\" (UniqueName: \"kubernetes.io/projected/6fe7893f-d888-4ac1-8179-4aa5322618f1-kube-api-access-vl9kg\") pod \"neutron-c50b-account-create-qn2bs\" (UID: \"6fe7893f-d888-4ac1-8179-4aa5322618f1\") " pod="openstack/neutron-c50b-account-create-qn2bs" Oct 03 14:18:14 crc kubenswrapper[4636]: I1003 14:18:14.910016 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c50b-account-create-qn2bs" Oct 03 14:18:15 crc kubenswrapper[4636]: I1003 14:18:15.068224 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-448f-account-create-prkdj"] Oct 03 14:18:15 crc kubenswrapper[4636]: W1003 14:18:15.081737 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod888c174c_e532_4731_a87a_7490a32c8e8b.slice/crio-f35795b277ea651c039798660b454616dbf8d304ab6423f0646892d11c132a5d WatchSource:0}: Error finding container f35795b277ea651c039798660b454616dbf8d304ab6423f0646892d11c132a5d: Status 404 returned error can't find the container with id f35795b277ea651c039798660b454616dbf8d304ab6423f0646892d11c132a5d Oct 03 14:18:15 crc kubenswrapper[4636]: I1003 14:18:15.141598 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6e1e-account-create-5f99d" Oct 03 14:18:15 crc kubenswrapper[4636]: I1003 14:18:15.144932 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-dc03-account-create-rwrxn"] Oct 03 14:18:15 crc kubenswrapper[4636]: W1003 14:18:15.162784 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dd02d0a_c1bc_4e2d_a682_7b2db952669a.slice/crio-a2ecf76518488d7a5fe4820dcced215bc672a3965577a4ded448facd24c880e5 WatchSource:0}: Error finding container a2ecf76518488d7a5fe4820dcced215bc672a3965577a4ded448facd24c880e5: Status 404 returned error can't find the container with id a2ecf76518488d7a5fe4820dcced215bc672a3965577a4ded448facd24c880e5 Oct 03 14:18:15 crc kubenswrapper[4636]: I1003 14:18:15.184034 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c50b-account-create-qn2bs"] Oct 03 14:18:15 crc kubenswrapper[4636]: W1003 14:18:15.204185 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fe7893f_d888_4ac1_8179_4aa5322618f1.slice/crio-a48c9b82ad7980735855edb72d5af493c4d0bf68948cb990ba00386230c40cdd WatchSource:0}: Error finding container a48c9b82ad7980735855edb72d5af493c4d0bf68948cb990ba00386230c40cdd: Status 404 returned error can't find the container with id a48c9b82ad7980735855edb72d5af493c4d0bf68948cb990ba00386230c40cdd Oct 03 14:18:15 crc kubenswrapper[4636]: I1003 14:18:15.240190 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-954p4\" (UniqueName: \"kubernetes.io/projected/659ed666-3a55-49bb-a35e-a59098f195d0-kube-api-access-954p4\") pod \"659ed666-3a55-49bb-a35e-a59098f195d0\" (UID: \"659ed666-3a55-49bb-a35e-a59098f195d0\") " Oct 03 14:18:15 crc kubenswrapper[4636]: I1003 14:18:15.247649 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/659ed666-3a55-49bb-a35e-a59098f195d0-kube-api-access-954p4" (OuterVolumeSpecName: "kube-api-access-954p4") pod "659ed666-3a55-49bb-a35e-a59098f195d0" (UID: "659ed666-3a55-49bb-a35e-a59098f195d0"). InnerVolumeSpecName "kube-api-access-954p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:18:15 crc kubenswrapper[4636]: I1003 14:18:15.345278 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-954p4\" (UniqueName: \"kubernetes.io/projected/659ed666-3a55-49bb-a35e-a59098f195d0-kube-api-access-954p4\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:15 crc kubenswrapper[4636]: I1003 14:18:15.835887 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6e1e-account-create-5f99d" Oct 03 14:18:15 crc kubenswrapper[4636]: I1003 14:18:15.835889 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6e1e-account-create-5f99d" event={"ID":"659ed666-3a55-49bb-a35e-a59098f195d0","Type":"ContainerDied","Data":"34880d04e2a5a1584516b754e98af2100a659dc21fecb4ade2c09fcd9b6f0110"} Oct 03 14:18:15 crc kubenswrapper[4636]: I1003 14:18:15.835961 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34880d04e2a5a1584516b754e98af2100a659dc21fecb4ade2c09fcd9b6f0110" Oct 03 14:18:15 crc kubenswrapper[4636]: I1003 14:18:15.837579 4636 generic.go:334] "Generic (PLEG): container finished" podID="6fe7893f-d888-4ac1-8179-4aa5322618f1" containerID="bd8c444f0feca9e0b1e45ed016bfbdb580cb90fd571e9006cef20983f9ec3930" exitCode=0 Oct 03 14:18:15 crc kubenswrapper[4636]: I1003 14:18:15.837655 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c50b-account-create-qn2bs" event={"ID":"6fe7893f-d888-4ac1-8179-4aa5322618f1","Type":"ContainerDied","Data":"bd8c444f0feca9e0b1e45ed016bfbdb580cb90fd571e9006cef20983f9ec3930"} Oct 03 14:18:15 crc kubenswrapper[4636]: I1003 14:18:15.837680 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c50b-account-create-qn2bs" event={"ID":"6fe7893f-d888-4ac1-8179-4aa5322618f1","Type":"ContainerStarted","Data":"a48c9b82ad7980735855edb72d5af493c4d0bf68948cb990ba00386230c40cdd"} Oct 03 14:18:15 crc kubenswrapper[4636]: I1003 14:18:15.838962 4636 generic.go:334] "Generic (PLEG): container finished" podID="4dd02d0a-c1bc-4e2d-a682-7b2db952669a" containerID="4e8ccede73b08a34d31fa25ce8f102a724dd4669e4e79415b0f717f35ab25cf2" exitCode=0 Oct 03 14:18:15 crc kubenswrapper[4636]: I1003 14:18:15.839013 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-dc03-account-create-rwrxn" event={"ID":"4dd02d0a-c1bc-4e2d-a682-7b2db952669a","Type":"ContainerDied","Data":"4e8ccede73b08a34d31fa25ce8f102a724dd4669e4e79415b0f717f35ab25cf2"} Oct 03 14:18:15 crc kubenswrapper[4636]: I1003 14:18:15.839029 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-dc03-account-create-rwrxn" event={"ID":"4dd02d0a-c1bc-4e2d-a682-7b2db952669a","Type":"ContainerStarted","Data":"a2ecf76518488d7a5fe4820dcced215bc672a3965577a4ded448facd24c880e5"} Oct 03 14:18:15 crc kubenswrapper[4636]: I1003 14:18:15.840175 4636 generic.go:334] "Generic (PLEG): container finished" podID="888c174c-e532-4731-a87a-7490a32c8e8b" containerID="4ee2dc0da05e851a61daa34a3a7591c2d7340b64d96a80d2c7d38250f1eac576" exitCode=0 Oct 03 14:18:15 crc kubenswrapper[4636]: I1003 14:18:15.840203 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-448f-account-create-prkdj" event={"ID":"888c174c-e532-4731-a87a-7490a32c8e8b","Type":"ContainerDied","Data":"4ee2dc0da05e851a61daa34a3a7591c2d7340b64d96a80d2c7d38250f1eac576"} Oct 03 14:18:15 crc kubenswrapper[4636]: I1003 14:18:15.840218 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-448f-account-create-prkdj" event={"ID":"888c174c-e532-4731-a87a-7490a32c8e8b","Type":"ContainerStarted","Data":"f35795b277ea651c039798660b454616dbf8d304ab6423f0646892d11c132a5d"} Oct 03 14:18:16 crc kubenswrapper[4636]: I1003 14:18:16.935873 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4584-account-create-sbqsq"] Oct 03 14:18:16 crc kubenswrapper[4636]: E1003 14:18:16.936470 4636 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="659ed666-3a55-49bb-a35e-a59098f195d0" containerName="mariadb-account-create" Oct 03 14:18:16 crc kubenswrapper[4636]: I1003 14:18:16.936481 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="659ed666-3a55-49bb-a35e-a59098f195d0" containerName="mariadb-account-create" Oct 03 14:18:16 crc kubenswrapper[4636]: I1003 14:18:16.936618 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="659ed666-3a55-49bb-a35e-a59098f195d0" containerName="mariadb-account-create" Oct 03 14:18:16 crc kubenswrapper[4636]: I1003 14:18:16.937194 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4584-account-create-sbqsq" Oct 03 14:18:16 crc kubenswrapper[4636]: I1003 14:18:16.942671 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 03 14:18:16 crc kubenswrapper[4636]: I1003 14:18:16.957609 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4584-account-create-sbqsq"] Oct 03 14:18:16 crc kubenswrapper[4636]: I1003 14:18:16.970234 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbpbq\" (UniqueName: \"kubernetes.io/projected/1da9d9e1-4437-471a-9a5b-d507a11f1695-kube-api-access-zbpbq\") pod \"keystone-4584-account-create-sbqsq\" (UID: \"1da9d9e1-4437-471a-9a5b-d507a11f1695\") " pod="openstack/keystone-4584-account-create-sbqsq" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.071861 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbpbq\" (UniqueName: \"kubernetes.io/projected/1da9d9e1-4437-471a-9a5b-d507a11f1695-kube-api-access-zbpbq\") pod \"keystone-4584-account-create-sbqsq\" (UID: \"1da9d9e1-4437-471a-9a5b-d507a11f1695\") " pod="openstack/keystone-4584-account-create-sbqsq" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.098376 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbpbq\" (UniqueName: \"kubernetes.io/projected/1da9d9e1-4437-471a-9a5b-d507a11f1695-kube-api-access-zbpbq\") pod \"keystone-4584-account-create-sbqsq\" (UID: \"1da9d9e1-4437-471a-9a5b-d507a11f1695\") " pod="openstack/keystone-4584-account-create-sbqsq" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.162880 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-dc03-account-create-rwrxn" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.255907 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-448f-account-create-prkdj" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.260074 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4584-account-create-sbqsq" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.273742 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c50b-account-create-qn2bs" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.277267 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l268v\" (UniqueName: \"kubernetes.io/projected/4dd02d0a-c1bc-4e2d-a682-7b2db952669a-kube-api-access-l268v\") pod \"4dd02d0a-c1bc-4e2d-a682-7b2db952669a\" (UID: \"4dd02d0a-c1bc-4e2d-a682-7b2db952669a\") " Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.284380 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dd02d0a-c1bc-4e2d-a682-7b2db952669a-kube-api-access-l268v" (OuterVolumeSpecName: "kube-api-access-l268v") pod "4dd02d0a-c1bc-4e2d-a682-7b2db952669a" (UID: "4dd02d0a-c1bc-4e2d-a682-7b2db952669a"). InnerVolumeSpecName "kube-api-access-l268v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.378882 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl9kg\" (UniqueName: \"kubernetes.io/projected/6fe7893f-d888-4ac1-8179-4aa5322618f1-kube-api-access-vl9kg\") pod \"6fe7893f-d888-4ac1-8179-4aa5322618f1\" (UID: \"6fe7893f-d888-4ac1-8179-4aa5322618f1\") " Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.378972 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ds64\" (UniqueName: \"kubernetes.io/projected/888c174c-e532-4731-a87a-7490a32c8e8b-kube-api-access-2ds64\") pod \"888c174c-e532-4731-a87a-7490a32c8e8b\" (UID: \"888c174c-e532-4731-a87a-7490a32c8e8b\") " Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.379392 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l268v\" (UniqueName: \"kubernetes.io/projected/4dd02d0a-c1bc-4e2d-a682-7b2db952669a-kube-api-access-l268v\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.382350 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/888c174c-e532-4731-a87a-7490a32c8e8b-kube-api-access-2ds64" (OuterVolumeSpecName: "kube-api-access-2ds64") pod "888c174c-e532-4731-a87a-7490a32c8e8b" (UID: "888c174c-e532-4731-a87a-7490a32c8e8b"). InnerVolumeSpecName "kube-api-access-2ds64". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.383000 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe7893f-d888-4ac1-8179-4aa5322618f1-kube-api-access-vl9kg" (OuterVolumeSpecName: "kube-api-access-vl9kg") pod "6fe7893f-d888-4ac1-8179-4aa5322618f1" (UID: "6fe7893f-d888-4ac1-8179-4aa5322618f1"). InnerVolumeSpecName "kube-api-access-vl9kg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.481485 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ds64\" (UniqueName: \"kubernetes.io/projected/888c174c-e532-4731-a87a-7490a32c8e8b-kube-api-access-2ds64\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.481515 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl9kg\" (UniqueName: \"kubernetes.io/projected/6fe7893f-d888-4ac1-8179-4aa5322618f1-kube-api-access-vl9kg\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.697924 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-8sj76"] Oct 03 14:18:17 crc kubenswrapper[4636]: E1003 14:18:17.698576 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe7893f-d888-4ac1-8179-4aa5322618f1" containerName="mariadb-account-create" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.698594 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe7893f-d888-4ac1-8179-4aa5322618f1" containerName="mariadb-account-create" Oct 03 14:18:17 crc kubenswrapper[4636]: E1003 14:18:17.698609 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="888c174c-e532-4731-a87a-7490a32c8e8b" containerName="mariadb-account-create" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.698614 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="888c174c-e532-4731-a87a-7490a32c8e8b" containerName="mariadb-account-create" Oct 03 14:18:17 crc kubenswrapper[4636]: E1003 14:18:17.698655 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd02d0a-c1bc-4e2d-a682-7b2db952669a" containerName="mariadb-account-create" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.698663 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd02d0a-c1bc-4e2d-a682-7b2db952669a" containerName="mariadb-account-create" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.698804 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe7893f-d888-4ac1-8179-4aa5322618f1" containerName="mariadb-account-create" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.698822 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="888c174c-e532-4731-a87a-7490a32c8e8b" containerName="mariadb-account-create" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.698833 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dd02d0a-c1bc-4e2d-a682-7b2db952669a" containerName="mariadb-account-create" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.699397 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-8sj76" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.701679 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ns26q" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.702208 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.724444 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8sj76"] Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.786415 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc578a05-9113-4226-bf5e-a8e907722e8e-combined-ca-bundle\") pod \"glance-db-sync-8sj76\" (UID: \"bc578a05-9113-4226-bf5e-a8e907722e8e\") " pod="openstack/glance-db-sync-8sj76" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.786474 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptm47\" (UniqueName: \"kubernetes.io/projected/bc578a05-9113-4226-bf5e-a8e907722e8e-kube-api-access-ptm47\") pod \"glance-db-sync-8sj76\" (UID: \"bc578a05-9113-4226-bf5e-a8e907722e8e\") " pod="openstack/glance-db-sync-8sj76" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.786510 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc578a05-9113-4226-bf5e-a8e907722e8e-config-data\") pod \"glance-db-sync-8sj76\" (UID: \"bc578a05-9113-4226-bf5e-a8e907722e8e\") " pod="openstack/glance-db-sync-8sj76" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.786556 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc578a05-9113-4226-bf5e-a8e907722e8e-db-sync-config-data\") pod \"glance-db-sync-8sj76\" (UID: \"bc578a05-9113-4226-bf5e-a8e907722e8e\") " pod="openstack/glance-db-sync-8sj76" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.878555 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c50b-account-create-qn2bs" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.879046 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c50b-account-create-qn2bs" event={"ID":"6fe7893f-d888-4ac1-8179-4aa5322618f1","Type":"ContainerDied","Data":"a48c9b82ad7980735855edb72d5af493c4d0bf68948cb990ba00386230c40cdd"} Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.882635 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a48c9b82ad7980735855edb72d5af493c4d0bf68948cb990ba00386230c40cdd" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.885137 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-dc03-account-create-rwrxn" event={"ID":"4dd02d0a-c1bc-4e2d-a682-7b2db952669a","Type":"ContainerDied","Data":"a2ecf76518488d7a5fe4820dcced215bc672a3965577a4ded448facd24c880e5"} Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.885227 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2ecf76518488d7a5fe4820dcced215bc672a3965577a4ded448facd24c880e5" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.885331 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-dc03-account-create-rwrxn" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.887617 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc578a05-9113-4226-bf5e-a8e907722e8e-db-sync-config-data\") pod \"glance-db-sync-8sj76\" (UID: \"bc578a05-9113-4226-bf5e-a8e907722e8e\") " pod="openstack/glance-db-sync-8sj76" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.887729 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc578a05-9113-4226-bf5e-a8e907722e8e-combined-ca-bundle\") pod \"glance-db-sync-8sj76\" (UID: \"bc578a05-9113-4226-bf5e-a8e907722e8e\") " pod="openstack/glance-db-sync-8sj76" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.887799 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptm47\" (UniqueName: \"kubernetes.io/projected/bc578a05-9113-4226-bf5e-a8e907722e8e-kube-api-access-ptm47\") pod \"glance-db-sync-8sj76\" (UID: \"bc578a05-9113-4226-bf5e-a8e907722e8e\") " pod="openstack/glance-db-sync-8sj76" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.891300 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc578a05-9113-4226-bf5e-a8e907722e8e-config-data\") pod \"glance-db-sync-8sj76\" (UID: \"bc578a05-9113-4226-bf5e-a8e907722e8e\") " pod="openstack/glance-db-sync-8sj76" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.892171 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-448f-account-create-prkdj" event={"ID":"888c174c-e532-4731-a87a-7490a32c8e8b","Type":"ContainerDied","Data":"f35795b277ea651c039798660b454616dbf8d304ab6423f0646892d11c132a5d"} Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.892224 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f35795b277ea651c039798660b454616dbf8d304ab6423f0646892d11c132a5d" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.892290 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-448f-account-create-prkdj" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.896594 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc578a05-9113-4226-bf5e-a8e907722e8e-db-sync-config-data\") pod \"glance-db-sync-8sj76\" (UID: \"bc578a05-9113-4226-bf5e-a8e907722e8e\") " pod="openstack/glance-db-sync-8sj76" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.896602 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc578a05-9113-4226-bf5e-a8e907722e8e-combined-ca-bundle\") pod \"glance-db-sync-8sj76\" (UID: \"bc578a05-9113-4226-bf5e-a8e907722e8e\") " pod="openstack/glance-db-sync-8sj76" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.896879 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc578a05-9113-4226-bf5e-a8e907722e8e-config-data\") pod \"glance-db-sync-8sj76\" (UID: \"bc578a05-9113-4226-bf5e-a8e907722e8e\") " pod="openstack/glance-db-sync-8sj76" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.912675 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptm47\" (UniqueName: \"kubernetes.io/projected/bc578a05-9113-4226-bf5e-a8e907722e8e-kube-api-access-ptm47\") pod \"glance-db-sync-8sj76\" (UID: \"bc578a05-9113-4226-bf5e-a8e907722e8e\") " pod="openstack/glance-db-sync-8sj76" Oct 03 14:18:17 crc kubenswrapper[4636]: I1003 14:18:17.971238 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4584-account-create-sbqsq"] Oct 03 14:18:17 crc kubenswrapper[4636]: W1003 14:18:17.976385 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1da9d9e1_4437_471a_9a5b_d507a11f1695.slice/crio-cee49e6fdab17c795a365e918c1bfbe68757fea2f0374e0b2eda37eb32975003 WatchSource:0}: Error finding container cee49e6fdab17c795a365e918c1bfbe68757fea2f0374e0b2eda37eb32975003: Status 404 returned error can't find the container with id cee49e6fdab17c795a365e918c1bfbe68757fea2f0374e0b2eda37eb32975003 Oct 03 14:18:18 crc kubenswrapper[4636]: I1003 14:18:18.017457 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-8sj76" Oct 03 14:18:18 crc kubenswrapper[4636]: I1003 14:18:18.573691 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8sj76"] Oct 03 14:18:18 crc kubenswrapper[4636]: I1003 14:18:18.900508 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8sj76" event={"ID":"bc578a05-9113-4226-bf5e-a8e907722e8e","Type":"ContainerStarted","Data":"8056ce4594b505b24b1baccce523e7e52d040ad31f8a14f1e90abc4dc2964cd9"} Oct 03 14:18:18 crc kubenswrapper[4636]: I1003 14:18:18.905909 4636 generic.go:334] "Generic (PLEG): container finished" podID="1da9d9e1-4437-471a-9a5b-d507a11f1695" containerID="3f7d3e73e29a1e05ce1a3b37f2e7bd7a7ca98b3dfd17338119dc600b7b27ae37" exitCode=0 Oct 03 14:18:18 crc kubenswrapper[4636]: I1003 14:18:18.906033 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4584-account-create-sbqsq" event={"ID":"1da9d9e1-4437-471a-9a5b-d507a11f1695","Type":"ContainerDied","Data":"3f7d3e73e29a1e05ce1a3b37f2e7bd7a7ca98b3dfd17338119dc600b7b27ae37"} Oct 03 14:18:18 crc kubenswrapper[4636]: I1003 14:18:18.906357 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4584-account-create-sbqsq" event={"ID":"1da9d9e1-4437-471a-9a5b-d507a11f1695","Type":"ContainerStarted","Data":"cee49e6fdab17c795a365e918c1bfbe68757fea2f0374e0b2eda37eb32975003"} Oct 03 14:18:20 crc kubenswrapper[4636]: I1003 14:18:20.240059 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4584-account-create-sbqsq" Oct 03 14:18:20 crc kubenswrapper[4636]: I1003 14:18:20.343245 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbpbq\" (UniqueName: \"kubernetes.io/projected/1da9d9e1-4437-471a-9a5b-d507a11f1695-kube-api-access-zbpbq\") pod \"1da9d9e1-4437-471a-9a5b-d507a11f1695\" (UID: \"1da9d9e1-4437-471a-9a5b-d507a11f1695\") " Oct 03 14:18:20 crc kubenswrapper[4636]: I1003 14:18:20.349536 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1da9d9e1-4437-471a-9a5b-d507a11f1695-kube-api-access-zbpbq" (OuterVolumeSpecName: "kube-api-access-zbpbq") pod "1da9d9e1-4437-471a-9a5b-d507a11f1695" (UID: "1da9d9e1-4437-471a-9a5b-d507a11f1695"). InnerVolumeSpecName "kube-api-access-zbpbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:18:20 crc kubenswrapper[4636]: I1003 14:18:20.445431 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbpbq\" (UniqueName: \"kubernetes.io/projected/1da9d9e1-4437-471a-9a5b-d507a11f1695-kube-api-access-zbpbq\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:20 crc kubenswrapper[4636]: I1003 14:18:20.923212 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4584-account-create-sbqsq" event={"ID":"1da9d9e1-4437-471a-9a5b-d507a11f1695","Type":"ContainerDied","Data":"cee49e6fdab17c795a365e918c1bfbe68757fea2f0374e0b2eda37eb32975003"} Oct 03 14:18:20 crc kubenswrapper[4636]: I1003 14:18:20.923483 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cee49e6fdab17c795a365e918c1bfbe68757fea2f0374e0b2eda37eb32975003" Oct 03 14:18:20 crc kubenswrapper[4636]: I1003 14:18:20.923285 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4584-account-create-sbqsq" Oct 03 14:18:22 crc kubenswrapper[4636]: I1003 14:18:22.521826 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-mg222"] Oct 03 14:18:22 crc kubenswrapper[4636]: E1003 14:18:22.522315 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da9d9e1-4437-471a-9a5b-d507a11f1695" containerName="mariadb-account-create" Oct 03 14:18:22 crc kubenswrapper[4636]: I1003 14:18:22.522329 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da9d9e1-4437-471a-9a5b-d507a11f1695" containerName="mariadb-account-create" Oct 03 14:18:22 crc kubenswrapper[4636]: I1003 14:18:22.522485 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da9d9e1-4437-471a-9a5b-d507a11f1695" containerName="mariadb-account-create" Oct 03 14:18:22 crc kubenswrapper[4636]: I1003 14:18:22.523018 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mg222" Oct 03 14:18:22 crc kubenswrapper[4636]: I1003 14:18:22.532402 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-mg222"] Oct 03 14:18:22 crc kubenswrapper[4636]: I1003 14:18:22.535781 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 14:18:22 crc kubenswrapper[4636]: I1003 14:18:22.535850 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-l2krf" Oct 03 14:18:22 crc kubenswrapper[4636]: I1003 14:18:22.535799 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 14:18:22 crc kubenswrapper[4636]: I1003 14:18:22.536050 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 14:18:22 crc kubenswrapper[4636]: I1003 14:18:22.582215 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c19545-0af9-461d-bf0b-ba0a08f8dbff-combined-ca-bundle\") pod \"keystone-db-sync-mg222\" (UID: \"71c19545-0af9-461d-bf0b-ba0a08f8dbff\") " pod="openstack/keystone-db-sync-mg222" Oct 03 14:18:22 crc kubenswrapper[4636]: I1003 14:18:22.582396 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2b8j\" (UniqueName: \"kubernetes.io/projected/71c19545-0af9-461d-bf0b-ba0a08f8dbff-kube-api-access-t2b8j\") pod \"keystone-db-sync-mg222\" (UID: \"71c19545-0af9-461d-bf0b-ba0a08f8dbff\") " pod="openstack/keystone-db-sync-mg222" Oct 03 14:18:22 crc kubenswrapper[4636]: I1003 14:18:22.582473 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c19545-0af9-461d-bf0b-ba0a08f8dbff-config-data\") pod \"keystone-db-sync-mg222\" (UID: \"71c19545-0af9-461d-bf0b-ba0a08f8dbff\") " pod="openstack/keystone-db-sync-mg222" Oct 03 14:18:22 crc kubenswrapper[4636]: I1003 14:18:22.684430 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c19545-0af9-461d-bf0b-ba0a08f8dbff-combined-ca-bundle\") pod \"keystone-db-sync-mg222\" (UID: \"71c19545-0af9-461d-bf0b-ba0a08f8dbff\") " pod="openstack/keystone-db-sync-mg222" Oct 03 14:18:22 crc kubenswrapper[4636]: I1003 14:18:22.684773 4636 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-t2b8j\" (UniqueName: \"kubernetes.io/projected/71c19545-0af9-461d-bf0b-ba0a08f8dbff-kube-api-access-t2b8j\") pod \"keystone-db-sync-mg222\" (UID: \"71c19545-0af9-461d-bf0b-ba0a08f8dbff\") " pod="openstack/keystone-db-sync-mg222" Oct 03 14:18:22 crc kubenswrapper[4636]: I1003 14:18:22.684841 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c19545-0af9-461d-bf0b-ba0a08f8dbff-config-data\") pod \"keystone-db-sync-mg222\" (UID: \"71c19545-0af9-461d-bf0b-ba0a08f8dbff\") " pod="openstack/keystone-db-sync-mg222" Oct 03 14:18:22 crc kubenswrapper[4636]: I1003 14:18:22.708788 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c19545-0af9-461d-bf0b-ba0a08f8dbff-config-data\") pod \"keystone-db-sync-mg222\" (UID: \"71c19545-0af9-461d-bf0b-ba0a08f8dbff\") " pod="openstack/keystone-db-sync-mg222" Oct 03 14:18:22 crc kubenswrapper[4636]: I1003 14:18:22.711203 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c19545-0af9-461d-bf0b-ba0a08f8dbff-combined-ca-bundle\") pod \"keystone-db-sync-mg222\" (UID: \"71c19545-0af9-461d-bf0b-ba0a08f8dbff\") " pod="openstack/keystone-db-sync-mg222" Oct 03 14:18:22 crc kubenswrapper[4636]: I1003 14:18:22.714279 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2b8j\" (UniqueName: \"kubernetes.io/projected/71c19545-0af9-461d-bf0b-ba0a08f8dbff-kube-api-access-t2b8j\") pod \"keystone-db-sync-mg222\" (UID: \"71c19545-0af9-461d-bf0b-ba0a08f8dbff\") " pod="openstack/keystone-db-sync-mg222" Oct 03 14:18:22 crc kubenswrapper[4636]: I1003 14:18:22.859519 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mg222" Oct 03 14:18:23 crc kubenswrapper[4636]: I1003 14:18:23.368442 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-mg222"] Oct 03 14:18:23 crc kubenswrapper[4636]: I1003 14:18:23.970924 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mg222" event={"ID":"71c19545-0af9-461d-bf0b-ba0a08f8dbff","Type":"ContainerStarted","Data":"f927f28305ae54f83b4b0435a08008ebda8d2c81d241a2669dfa004d5a6ccad9"} Oct 03 14:18:24 crc kubenswrapper[4636]: I1003 14:18:24.214646 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift\") pod \"swift-storage-0\" (UID: \"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") " pod="openstack/swift-storage-0" Oct 03 14:18:24 crc kubenswrapper[4636]: I1003 14:18:24.230886 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/201b506e-9cc5-4ab0-9af4-96a357d19f6e-etc-swift\") pod \"swift-storage-0\" (UID: \"201b506e-9cc5-4ab0-9af4-96a357d19f6e\") " pod="openstack/swift-storage-0" Oct 03 14:18:24 crc kubenswrapper[4636]: I1003 14:18:24.405046 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 03 14:18:24 crc kubenswrapper[4636]: I1003 14:18:24.999121 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 03 14:18:33 crc kubenswrapper[4636]: I1003 14:18:33.051176 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"201b506e-9cc5-4ab0-9af4-96a357d19f6e","Type":"ContainerStarted","Data":"97b984315e5dae6c588cf212f710b5b9126de5a094457598ab899b7a9ca66422"} Oct 03 14:18:35 crc kubenswrapper[4636]: E1003 14:18:35.594524 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-keystone:current-podified" Oct 03 14:18:35 crc kubenswrapper[4636]: E1003 14:18:35.594975 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:keystone-db-sync,Image:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,Command:[/bin/bash],Args:[-c keystone-manage db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/keystone/keystone.conf,SubPath:keystone.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2b8j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42425,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42425,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-db-sync-mg222_openstack(71c19545-0af9-461d-bf0b-ba0a08f8dbff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:18:35 crc kubenswrapper[4636]: E1003 14:18:35.596174 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"keystone-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/keystone-db-sync-mg222" podUID="71c19545-0af9-461d-bf0b-ba0a08f8dbff" Oct 03 14:18:36 crc kubenswrapper[4636]: E1003 14:18:36.077892 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"keystone-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-keystone:current-podified\\\"\"" 
pod="openstack/keystone-db-sync-mg222" podUID="71c19545-0af9-461d-bf0b-ba0a08f8dbff" Oct 03 14:18:37 crc kubenswrapper[4636]: I1003 14:18:37.085003 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8sj76" event={"ID":"bc578a05-9113-4226-bf5e-a8e907722e8e","Type":"ContainerStarted","Data":"9c3f56e587df9ba67da7828ee218e2c1da5b0a865be8ce95542c9b46c452883e"} Oct 03 14:18:37 crc kubenswrapper[4636]: I1003 14:18:37.088704 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"201b506e-9cc5-4ab0-9af4-96a357d19f6e","Type":"ContainerStarted","Data":"7da917e5d636048ea9d63dcb41c039b4a94a5829a8a842665339e3d2dcf2db42"} Oct 03 14:18:37 crc kubenswrapper[4636]: I1003 14:18:37.088846 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"201b506e-9cc5-4ab0-9af4-96a357d19f6e","Type":"ContainerStarted","Data":"b903dbe6bddd857896307a33c1d9408261d67ab07902e84bb718ee52591b18ee"} Oct 03 14:18:37 crc kubenswrapper[4636]: I1003 14:18:37.088957 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"201b506e-9cc5-4ab0-9af4-96a357d19f6e","Type":"ContainerStarted","Data":"60289c0cc69273473d6c1aa0961f7f11296f038abcae050ce89cbace128efb79"} Oct 03 14:18:37 crc kubenswrapper[4636]: I1003 14:18:37.089055 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"201b506e-9cc5-4ab0-9af4-96a357d19f6e","Type":"ContainerStarted","Data":"6f29bdb7f1bc928f9ee16b47a633b91a1518b657548e240b27503ad1faf97852"} Oct 03 14:18:37 crc kubenswrapper[4636]: I1003 14:18:37.100701 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-8sj76" podStartSLOduration=3.049204221 podStartE2EDuration="20.100680624s" podCreationTimestamp="2025-10-03 14:18:17 +0000 UTC" firstStartedPulling="2025-10-03 14:18:18.587454819 +0000 UTC m=+1048.446181066" lastFinishedPulling="2025-10-03 14:18:35.638931222 +0000 UTC m=+1065.497657469" observedRunningTime="2025-10-03 14:18:37.099333959 +0000 UTC m=+1066.958060206" watchObservedRunningTime="2025-10-03 14:18:37.100680624 +0000 UTC m=+1066.959406871" Oct 03 14:18:39 crc kubenswrapper[4636]: I1003 14:18:39.111549 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"201b506e-9cc5-4ab0-9af4-96a357d19f6e","Type":"ContainerStarted","Data":"7bbac583818c2b31a8892961528a2fc40ee8171a054b4209f0d76a24c318b98d"} Oct 03 14:18:39 crc kubenswrapper[4636]: I1003 14:18:39.111869 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"201b506e-9cc5-4ab0-9af4-96a357d19f6e","Type":"ContainerStarted","Data":"6a5e7b3bd04fabee36e6c971ae80aa7a7983ca1983f5d0f767776f0b3eab384b"} Oct 03 14:18:39 crc kubenswrapper[4636]: I1003 14:18:39.111880 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"201b506e-9cc5-4ab0-9af4-96a357d19f6e","Type":"ContainerStarted","Data":"11b8cd03b6a7044bf25636ac366d9cbeaf2998ea5f0e349118cec58b4f9a396c"} Oct 03 14:18:40 crc kubenswrapper[4636]: I1003 14:18:40.124810 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"201b506e-9cc5-4ab0-9af4-96a357d19f6e","Type":"ContainerStarted","Data":"a2c23007b28396a206d007dec817370126568b24360db5828710399ff8e3f802"} Oct 03 14:18:41 crc kubenswrapper[4636]: I1003 14:18:41.144909 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"201b506e-9cc5-4ab0-9af4-96a357d19f6e","Type":"ContainerStarted","Data":"22fa60ac99bf4d722d680eee4b76a4939791dda65d0651c79b5e5820b2f396de"} Oct 03 14:18:41 crc kubenswrapper[4636]: I1003 14:18:41.145300 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"201b506e-9cc5-4ab0-9af4-96a357d19f6e","Type":"ContainerStarted","Data":"01cd4ae1bd5c9d3fccc5894085995097b2370eca1c0174169f5f456df6b6d6c3"} Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.164574 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"201b506e-9cc5-4ab0-9af4-96a357d19f6e","Type":"ContainerStarted","Data":"cf11a24f4adb4d8fb9edf6eb897300744b58957bd80c32e20ee70a9b0a9a643b"} Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.165973 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"201b506e-9cc5-4ab0-9af4-96a357d19f6e","Type":"ContainerStarted","Data":"c808ce10d177d44eb8d28ad3955849e4058316141a4b901a5c69c68fd47541ac"} Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.166065 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"201b506e-9cc5-4ab0-9af4-96a357d19f6e","Type":"ContainerStarted","Data":"4bfbfc14e323014a4cd7431012a1fccbd582a810f0ef37f5dbea4a88edadc8b0"} Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.166172 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"201b506e-9cc5-4ab0-9af4-96a357d19f6e","Type":"ContainerStarted","Data":"e5bbfdf5397f8d41a8d5e45d60724384ae6869e696bb5bea5749ff86e1aef042"} Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.166355 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"201b506e-9cc5-4ab0-9af4-96a357d19f6e","Type":"ContainerStarted","Data":"0724f09c2f57b20a0f135d4665867e0039850c3d36f73678ac9926fcbc5fdfdf"} Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.238294 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=74.945416638 podStartE2EDuration="1m23.238275362s" podCreationTimestamp="2025-10-03 14:17:19 +0000 UTC" firstStartedPulling="2025-10-03 14:18:32.504191915 +0000 UTC m=+1062.362918162" lastFinishedPulling="2025-10-03 14:18:40.797050639 +0000 UTC m=+1070.655776886" observedRunningTime="2025-10-03 14:18:42.224546088 +0000 UTC m=+1072.083272355" watchObservedRunningTime="2025-10-03 14:18:42.238275362 +0000 UTC m=+1072.097001609" Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.473643 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-nt8td"] Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.475288 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.478942 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.499699 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-nt8td"] Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.539302 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msb6p\" (UniqueName: \"kubernetes.io/projected/8f822c2c-795c-4729-971d-b63859faff56-kube-api-access-msb6p\") pod \"dnsmasq-dns-764c5664d7-nt8td\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.539363 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-dns-svc\") pod \"dnsmasq-dns-764c5664d7-nt8td\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.539393 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-nt8td\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.539592 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-nt8td\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.539745 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-nt8td\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.540122 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-config\") pod \"dnsmasq-dns-764c5664d7-nt8td\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.641711 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-config\") pod \"dnsmasq-dns-764c5664d7-nt8td\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.641800 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msb6p\" (UniqueName: \"kubernetes.io/projected/8f822c2c-795c-4729-971d-b63859faff56-kube-api-access-msb6p\") pod \"dnsmasq-dns-764c5664d7-nt8td\" (UID: 
\"8f822c2c-795c-4729-971d-b63859faff56\") " pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.641855 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-dns-svc\") pod \"dnsmasq-dns-764c5664d7-nt8td\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.641888 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-nt8td\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.641948 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-nt8td\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.642006 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-nt8td\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.642735 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-config\") pod \"dnsmasq-dns-764c5664d7-nt8td\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.643152 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-nt8td\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.643209 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-nt8td\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.643243 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-dns-svc\") pod \"dnsmasq-dns-764c5664d7-nt8td\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.643625 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-nt8td\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:42 crc 
kubenswrapper[4636]: I1003 14:18:42.665963 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msb6p\" (UniqueName: \"kubernetes.io/projected/8f822c2c-795c-4729-971d-b63859faff56-kube-api-access-msb6p\") pod \"dnsmasq-dns-764c5664d7-nt8td\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:42 crc kubenswrapper[4636]: I1003 14:18:42.794339 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:43 crc kubenswrapper[4636]: I1003 14:18:43.277896 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-nt8td"] Oct 03 14:18:44 crc kubenswrapper[4636]: I1003 14:18:44.180318 4636 generic.go:334] "Generic (PLEG): container finished" podID="8f822c2c-795c-4729-971d-b63859faff56" containerID="230598acea54227e5b3599f0d92837292557002cb851123807d88c64402ff3f1" exitCode=0 Oct 03 14:18:44 crc kubenswrapper[4636]: I1003 14:18:44.180364 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-nt8td" event={"ID":"8f822c2c-795c-4729-971d-b63859faff56","Type":"ContainerDied","Data":"230598acea54227e5b3599f0d92837292557002cb851123807d88c64402ff3f1"} Oct 03 14:18:44 crc kubenswrapper[4636]: I1003 14:18:44.180733 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-nt8td" event={"ID":"8f822c2c-795c-4729-971d-b63859faff56","Type":"ContainerStarted","Data":"3c62067300b176cecb08ae7f3a72f6543bd4581ebd7da55426547bee8a22c5df"} Oct 03 14:18:45 crc kubenswrapper[4636]: I1003 14:18:45.205966 4636 generic.go:334] "Generic (PLEG): container finished" podID="bc578a05-9113-4226-bf5e-a8e907722e8e" containerID="9c3f56e587df9ba67da7828ee218e2c1da5b0a865be8ce95542c9b46c452883e" exitCode=0 Oct 03 14:18:45 crc kubenswrapper[4636]: I1003 14:18:45.206233 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8sj76" event={"ID":"bc578a05-9113-4226-bf5e-a8e907722e8e","Type":"ContainerDied","Data":"9c3f56e587df9ba67da7828ee218e2c1da5b0a865be8ce95542c9b46c452883e"} Oct 03 14:18:45 crc kubenswrapper[4636]: I1003 14:18:45.209672 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-nt8td" event={"ID":"8f822c2c-795c-4729-971d-b63859faff56","Type":"ContainerStarted","Data":"0f3f325cf3695e2ee8f3c15db081792627f08c779cbfe500d0992c1bd6fd5b2d"} Oct 03 14:18:45 crc kubenswrapper[4636]: I1003 14:18:45.209819 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:45 crc kubenswrapper[4636]: I1003 14:18:45.249519 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-nt8td" podStartSLOduration=3.249499775 podStartE2EDuration="3.249499775s" podCreationTimestamp="2025-10-03 14:18:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:18:45.24619649 +0000 UTC m=+1075.104922737" watchObservedRunningTime="2025-10-03 14:18:45.249499775 +0000 UTC m=+1075.108226022" Oct 03 14:18:46 crc kubenswrapper[4636]: I1003 14:18:46.589226 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-8sj76" Oct 03 14:18:46 crc kubenswrapper[4636]: I1003 14:18:46.705583 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc578a05-9113-4226-bf5e-a8e907722e8e-db-sync-config-data\") pod \"bc578a05-9113-4226-bf5e-a8e907722e8e\" (UID: \"bc578a05-9113-4226-bf5e-a8e907722e8e\") " Oct 03 14:18:46 crc kubenswrapper[4636]: I1003 14:18:46.705641 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptm47\" (UniqueName: \"kubernetes.io/projected/bc578a05-9113-4226-bf5e-a8e907722e8e-kube-api-access-ptm47\") pod \"bc578a05-9113-4226-bf5e-a8e907722e8e\" (UID: \"bc578a05-9113-4226-bf5e-a8e907722e8e\") " Oct 03 14:18:46 crc kubenswrapper[4636]: I1003 14:18:46.705738 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc578a05-9113-4226-bf5e-a8e907722e8e-combined-ca-bundle\") pod \"bc578a05-9113-4226-bf5e-a8e907722e8e\" (UID: \"bc578a05-9113-4226-bf5e-a8e907722e8e\") " Oct 03 14:18:46 crc kubenswrapper[4636]: I1003 14:18:46.705840 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc578a05-9113-4226-bf5e-a8e907722e8e-config-data\") pod \"bc578a05-9113-4226-bf5e-a8e907722e8e\" (UID: \"bc578a05-9113-4226-bf5e-a8e907722e8e\") " Oct 03 14:18:46 crc kubenswrapper[4636]: I1003 14:18:46.711260 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc578a05-9113-4226-bf5e-a8e907722e8e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bc578a05-9113-4226-bf5e-a8e907722e8e" (UID: "bc578a05-9113-4226-bf5e-a8e907722e8e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:18:46 crc kubenswrapper[4636]: I1003 14:18:46.713917 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc578a05-9113-4226-bf5e-a8e907722e8e-kube-api-access-ptm47" (OuterVolumeSpecName: "kube-api-access-ptm47") pod "bc578a05-9113-4226-bf5e-a8e907722e8e" (UID: "bc578a05-9113-4226-bf5e-a8e907722e8e"). InnerVolumeSpecName "kube-api-access-ptm47". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:18:46 crc kubenswrapper[4636]: I1003 14:18:46.733156 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc578a05-9113-4226-bf5e-a8e907722e8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc578a05-9113-4226-bf5e-a8e907722e8e" (UID: "bc578a05-9113-4226-bf5e-a8e907722e8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:18:46 crc kubenswrapper[4636]: I1003 14:18:46.760491 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc578a05-9113-4226-bf5e-a8e907722e8e-config-data" (OuterVolumeSpecName: "config-data") pod "bc578a05-9113-4226-bf5e-a8e907722e8e" (UID: "bc578a05-9113-4226-bf5e-a8e907722e8e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:18:46 crc kubenswrapper[4636]: I1003 14:18:46.807517 4636 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc578a05-9113-4226-bf5e-a8e907722e8e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:46 crc kubenswrapper[4636]: I1003 14:18:46.807781 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptm47\" (UniqueName: \"kubernetes.io/projected/bc578a05-9113-4226-bf5e-a8e907722e8e-kube-api-access-ptm47\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:46 crc kubenswrapper[4636]: I1003 14:18:46.807869 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc578a05-9113-4226-bf5e-a8e907722e8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:46 crc kubenswrapper[4636]: I1003 14:18:46.807947 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc578a05-9113-4226-bf5e-a8e907722e8e-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.225693 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8sj76" event={"ID":"bc578a05-9113-4226-bf5e-a8e907722e8e","Type":"ContainerDied","Data":"8056ce4594b505b24b1baccce523e7e52d040ad31f8a14f1e90abc4dc2964cd9"} Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.225737 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8056ce4594b505b24b1baccce523e7e52d040ad31f8a14f1e90abc4dc2964cd9" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.225715 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8sj76" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.628657 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-nt8td"] Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.629129 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-nt8td" podUID="8f822c2c-795c-4729-971d-b63859faff56" containerName="dnsmasq-dns" containerID="cri-o://0f3f325cf3695e2ee8f3c15db081792627f08c779cbfe500d0992c1bd6fd5b2d" gracePeriod=10 Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.689397 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-hjczt"] Oct 03 14:18:47 crc kubenswrapper[4636]: E1003 14:18:47.689770 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc578a05-9113-4226-bf5e-a8e907722e8e" containerName="glance-db-sync" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.689786 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc578a05-9113-4226-bf5e-a8e907722e8e" containerName="glance-db-sync" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.689954 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc578a05-9113-4226-bf5e-a8e907722e8e" containerName="glance-db-sync" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.690830 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.707605 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-hjczt"] Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.723880 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-config\") pod \"dnsmasq-dns-74f6bcbc87-hjczt\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.723930 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-hjczt\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.723998 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-hjczt\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.724020 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-hjczt\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.724166 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rq7r\" (UniqueName: \"kubernetes.io/projected/69fefc36-2141-4a6e-b716-ae8e2e330bb8-kube-api-access-2rq7r\") pod \"dnsmasq-dns-74f6bcbc87-hjczt\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.724204 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-hjczt\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.825651 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-hjczt\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.825694 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-hjczt\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.825808 4636 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2rq7r\" (UniqueName: \"kubernetes.io/projected/69fefc36-2141-4a6e-b716-ae8e2e330bb8-kube-api-access-2rq7r\") pod \"dnsmasq-dns-74f6bcbc87-hjczt\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.825838 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-hjczt\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.825919 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-config\") pod \"dnsmasq-dns-74f6bcbc87-hjczt\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.825938 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-hjczt\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.826587 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-hjczt\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.826628 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-hjczt\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.826700 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-hjczt\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.827287 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-config\") pod \"dnsmasq-dns-74f6bcbc87-hjczt\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.827289 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-hjczt\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" Oct 03 14:18:47 crc kubenswrapper[4636]: I1003 14:18:47.858606 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rq7r\" (UniqueName: 
\"kubernetes.io/projected/69fefc36-2141-4a6e-b716-ae8e2e330bb8-kube-api-access-2rq7r\") pod \"dnsmasq-dns-74f6bcbc87-hjczt\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.008936 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.128962 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.233172 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-config\") pod \"8f822c2c-795c-4729-971d-b63859faff56\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.233457 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-dns-svc\") pod \"8f822c2c-795c-4729-971d-b63859faff56\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.233599 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-ovsdbserver-sb\") pod \"8f822c2c-795c-4729-971d-b63859faff56\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.233647 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msb6p\" (UniqueName: \"kubernetes.io/projected/8f822c2c-795c-4729-971d-b63859faff56-kube-api-access-msb6p\") pod \"8f822c2c-795c-4729-971d-b63859faff56\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.233720 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-ovsdbserver-nb\") pod \"8f822c2c-795c-4729-971d-b63859faff56\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.233775 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-dns-swift-storage-0\") pod \"8f822c2c-795c-4729-971d-b63859faff56\" (UID: \"8f822c2c-795c-4729-971d-b63859faff56\") " Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.244465 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f822c2c-795c-4729-971d-b63859faff56-kube-api-access-msb6p" (OuterVolumeSpecName: "kube-api-access-msb6p") pod "8f822c2c-795c-4729-971d-b63859faff56" (UID: "8f822c2c-795c-4729-971d-b63859faff56"). InnerVolumeSpecName "kube-api-access-msb6p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.249475 4636 generic.go:334] "Generic (PLEG): container finished" podID="8f822c2c-795c-4729-971d-b63859faff56" containerID="0f3f325cf3695e2ee8f3c15db081792627f08c779cbfe500d0992c1bd6fd5b2d" exitCode=0 Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.249531 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-nt8td" event={"ID":"8f822c2c-795c-4729-971d-b63859faff56","Type":"ContainerDied","Data":"0f3f325cf3695e2ee8f3c15db081792627f08c779cbfe500d0992c1bd6fd5b2d"} Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.249559 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-nt8td" event={"ID":"8f822c2c-795c-4729-971d-b63859faff56","Type":"ContainerDied","Data":"3c62067300b176cecb08ae7f3a72f6543bd4581ebd7da55426547bee8a22c5df"} Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.249578 4636 scope.go:117] "RemoveContainer" containerID="0f3f325cf3695e2ee8f3c15db081792627f08c779cbfe500d0992c1bd6fd5b2d" Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.249701 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-nt8td" Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.303945 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8f822c2c-795c-4729-971d-b63859faff56" (UID: "8f822c2c-795c-4729-971d-b63859faff56"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.309546 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8f822c2c-795c-4729-971d-b63859faff56" (UID: "8f822c2c-795c-4729-971d-b63859faff56"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.312149 4636 scope.go:117] "RemoveContainer" containerID="230598acea54227e5b3599f0d92837292557002cb851123807d88c64402ff3f1" Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.320273 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8f822c2c-795c-4729-971d-b63859faff56" (UID: "8f822c2c-795c-4729-971d-b63859faff56"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.334537 4636 scope.go:117] "RemoveContainer" containerID="0f3f325cf3695e2ee8f3c15db081792627f08c779cbfe500d0992c1bd6fd5b2d" Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.335897 4636 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.335931 4636 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.335944 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msb6p\" (UniqueName: \"kubernetes.io/projected/8f822c2c-795c-4729-971d-b63859faff56-kube-api-access-msb6p\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.335957 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.337016 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8f822c2c-795c-4729-971d-b63859faff56" (UID: "8f822c2c-795c-4729-971d-b63859faff56"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:18:48 crc kubenswrapper[4636]: E1003 14:18:48.338334 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f3f325cf3695e2ee8f3c15db081792627f08c779cbfe500d0992c1bd6fd5b2d\": container with ID starting with 0f3f325cf3695e2ee8f3c15db081792627f08c779cbfe500d0992c1bd6fd5b2d not found: ID does not exist" containerID="0f3f325cf3695e2ee8f3c15db081792627f08c779cbfe500d0992c1bd6fd5b2d" Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.338395 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f3f325cf3695e2ee8f3c15db081792627f08c779cbfe500d0992c1bd6fd5b2d"} err="failed to get container status \"0f3f325cf3695e2ee8f3c15db081792627f08c779cbfe500d0992c1bd6fd5b2d\": rpc error: code = NotFound desc = could not find container \"0f3f325cf3695e2ee8f3c15db081792627f08c779cbfe500d0992c1bd6fd5b2d\": container with ID starting with 0f3f325cf3695e2ee8f3c15db081792627f08c779cbfe500d0992c1bd6fd5b2d not found: ID does not exist" Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.338427 4636 scope.go:117] "RemoveContainer" containerID="230598acea54227e5b3599f0d92837292557002cb851123807d88c64402ff3f1" Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.338619 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-config" (OuterVolumeSpecName: "config") pod "8f822c2c-795c-4729-971d-b63859faff56" (UID: "8f822c2c-795c-4729-971d-b63859faff56"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:18:48 crc kubenswrapper[4636]: E1003 14:18:48.338998 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"230598acea54227e5b3599f0d92837292557002cb851123807d88c64402ff3f1\": container with ID starting with 230598acea54227e5b3599f0d92837292557002cb851123807d88c64402ff3f1 not found: ID does not exist" containerID="230598acea54227e5b3599f0d92837292557002cb851123807d88c64402ff3f1" Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.339038 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"230598acea54227e5b3599f0d92837292557002cb851123807d88c64402ff3f1"} err="failed to get container status \"230598acea54227e5b3599f0d92837292557002cb851123807d88c64402ff3f1\": rpc error: code = NotFound desc = could not find container \"230598acea54227e5b3599f0d92837292557002cb851123807d88c64402ff3f1\": container with ID starting with 230598acea54227e5b3599f0d92837292557002cb851123807d88c64402ff3f1 not found: ID does not exist" Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.439201 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.439243 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f822c2c-795c-4729-971d-b63859faff56-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.571198 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-hjczt"] Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.592387 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-nt8td"] Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.601802 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-nt8td"] Oct 03 14:18:48 crc kubenswrapper[4636]: I1003 14:18:48.807086 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f822c2c-795c-4729-971d-b63859faff56" path="/var/lib/kubelet/pods/8f822c2c-795c-4729-971d-b63859faff56/volumes" Oct 03 14:18:49 crc kubenswrapper[4636]: I1003 14:18:49.259881 4636 generic.go:334] "Generic (PLEG): container finished" podID="69fefc36-2141-4a6e-b716-ae8e2e330bb8" containerID="480cac71ecee2e82d7dcabfc7329f93c858f425f9bbe046d72881a5b44ec783c" exitCode=0 Oct 03 14:18:49 crc kubenswrapper[4636]: I1003 14:18:49.259989 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" event={"ID":"69fefc36-2141-4a6e-b716-ae8e2e330bb8","Type":"ContainerDied","Data":"480cac71ecee2e82d7dcabfc7329f93c858f425f9bbe046d72881a5b44ec783c"} Oct 03 14:18:49 crc kubenswrapper[4636]: I1003 14:18:49.260364 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" event={"ID":"69fefc36-2141-4a6e-b716-ae8e2e330bb8","Type":"ContainerStarted","Data":"65338202177fcbb7465eaa3e5f160353d56fbb059cbbe90473127be839834d69"} Oct 03 14:18:50 crc kubenswrapper[4636]: I1003 14:18:50.278322 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mg222" 
event={"ID":"71c19545-0af9-461d-bf0b-ba0a08f8dbff","Type":"ContainerStarted","Data":"59b63ea0120facc0242d39598a9b996e020bfe2e115a1beb366c46b10289c32b"} Oct 03 14:18:50 crc kubenswrapper[4636]: I1003 14:18:50.280274 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" event={"ID":"69fefc36-2141-4a6e-b716-ae8e2e330bb8","Type":"ContainerStarted","Data":"7d082b8612ef2bda639bc4d2614bddf2d3100f3b657ecc771b1b7ee3d0d2ec42"} Oct 03 14:18:50 crc kubenswrapper[4636]: I1003 14:18:50.280445 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" Oct 03 14:18:50 crc kubenswrapper[4636]: I1003 14:18:50.371220 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-mg222" podStartSLOduration=2.115586573 podStartE2EDuration="28.371198122s" podCreationTimestamp="2025-10-03 14:18:22 +0000 UTC" firstStartedPulling="2025-10-03 14:18:23.380639249 +0000 UTC m=+1053.239365496" lastFinishedPulling="2025-10-03 14:18:49.636250798 +0000 UTC m=+1079.494977045" observedRunningTime="2025-10-03 14:18:50.352597173 +0000 UTC m=+1080.211323420" watchObservedRunningTime="2025-10-03 14:18:50.371198122 +0000 UTC m=+1080.229924369" Oct 03 14:18:50 crc kubenswrapper[4636]: I1003 14:18:50.432336 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" podStartSLOduration=3.432317938 podStartE2EDuration="3.432317938s" podCreationTimestamp="2025-10-03 14:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:18:50.431482016 +0000 UTC m=+1080.290208263" watchObservedRunningTime="2025-10-03 14:18:50.432317938 +0000 UTC m=+1080.291044185" Oct 03 14:18:54 crc kubenswrapper[4636]: I1003 14:18:54.310995 4636 generic.go:334] "Generic (PLEG): container finished" podID="71c19545-0af9-461d-bf0b-ba0a08f8dbff" containerID="59b63ea0120facc0242d39598a9b996e020bfe2e115a1beb366c46b10289c32b" exitCode=0 Oct 03 14:18:54 crc kubenswrapper[4636]: I1003 14:18:54.311328 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mg222" event={"ID":"71c19545-0af9-461d-bf0b-ba0a08f8dbff","Type":"ContainerDied","Data":"59b63ea0120facc0242d39598a9b996e020bfe2e115a1beb366c46b10289c32b"} Oct 03 14:18:55 crc kubenswrapper[4636]: I1003 14:18:55.648900 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-mg222" Oct 03 14:18:55 crc kubenswrapper[4636]: I1003 14:18:55.756608 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c19545-0af9-461d-bf0b-ba0a08f8dbff-config-data\") pod \"71c19545-0af9-461d-bf0b-ba0a08f8dbff\" (UID: \"71c19545-0af9-461d-bf0b-ba0a08f8dbff\") " Oct 03 14:18:55 crc kubenswrapper[4636]: I1003 14:18:55.756665 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2b8j\" (UniqueName: \"kubernetes.io/projected/71c19545-0af9-461d-bf0b-ba0a08f8dbff-kube-api-access-t2b8j\") pod \"71c19545-0af9-461d-bf0b-ba0a08f8dbff\" (UID: \"71c19545-0af9-461d-bf0b-ba0a08f8dbff\") " Oct 03 14:18:55 crc kubenswrapper[4636]: I1003 14:18:55.756812 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c19545-0af9-461d-bf0b-ba0a08f8dbff-combined-ca-bundle\") pod \"71c19545-0af9-461d-bf0b-ba0a08f8dbff\" (UID: \"71c19545-0af9-461d-bf0b-ba0a08f8dbff\") " Oct 03 14:18:55 crc kubenswrapper[4636]: I1003 14:18:55.761979 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71c19545-0af9-461d-bf0b-ba0a08f8dbff-kube-api-access-t2b8j" (OuterVolumeSpecName: "kube-api-access-t2b8j") pod "71c19545-0af9-461d-bf0b-ba0a08f8dbff" (UID: "71c19545-0af9-461d-bf0b-ba0a08f8dbff"). InnerVolumeSpecName "kube-api-access-t2b8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:18:55 crc kubenswrapper[4636]: I1003 14:18:55.784327 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c19545-0af9-461d-bf0b-ba0a08f8dbff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71c19545-0af9-461d-bf0b-ba0a08f8dbff" (UID: "71c19545-0af9-461d-bf0b-ba0a08f8dbff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:18:55 crc kubenswrapper[4636]: I1003 14:18:55.803837 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c19545-0af9-461d-bf0b-ba0a08f8dbff-config-data" (OuterVolumeSpecName: "config-data") pod "71c19545-0af9-461d-bf0b-ba0a08f8dbff" (UID: "71c19545-0af9-461d-bf0b-ba0a08f8dbff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:18:55 crc kubenswrapper[4636]: I1003 14:18:55.858213 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c19545-0af9-461d-bf0b-ba0a08f8dbff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:55 crc kubenswrapper[4636]: I1003 14:18:55.858250 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c19545-0af9-461d-bf0b-ba0a08f8dbff-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:55 crc kubenswrapper[4636]: I1003 14:18:55.858264 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2b8j\" (UniqueName: \"kubernetes.io/projected/71c19545-0af9-461d-bf0b-ba0a08f8dbff-kube-api-access-t2b8j\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.327756 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mg222" event={"ID":"71c19545-0af9-461d-bf0b-ba0a08f8dbff","Type":"ContainerDied","Data":"f927f28305ae54f83b4b0435a08008ebda8d2c81d241a2669dfa004d5a6ccad9"} Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.328091 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f927f28305ae54f83b4b0435a08008ebda8d2c81d241a2669dfa004d5a6ccad9" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.327833 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mg222" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.531065 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-hjczt"] Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.531354 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" podUID="69fefc36-2141-4a6e-b716-ae8e2e330bb8" containerName="dnsmasq-dns" containerID="cri-o://7d082b8612ef2bda639bc4d2614bddf2d3100f3b657ecc771b1b7ee3d0d2ec42" gracePeriod=10 Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.532259 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.546248 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-swxgw"] Oct 03 14:18:56 crc kubenswrapper[4636]: E1003 14:18:56.546797 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f822c2c-795c-4729-971d-b63859faff56" containerName="init" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.546890 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f822c2c-795c-4729-971d-b63859faff56" containerName="init" Oct 03 14:18:56 crc kubenswrapper[4636]: E1003 14:18:56.546960 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71c19545-0af9-461d-bf0b-ba0a08f8dbff" containerName="keystone-db-sync" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.547021 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c19545-0af9-461d-bf0b-ba0a08f8dbff" containerName="keystone-db-sync" Oct 03 14:18:56 crc kubenswrapper[4636]: E1003 14:18:56.547133 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f822c2c-795c-4729-971d-b63859faff56" containerName="dnsmasq-dns" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.547200 4636 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8f822c2c-795c-4729-971d-b63859faff56" containerName="dnsmasq-dns" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.547493 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="71c19545-0af9-461d-bf0b-ba0a08f8dbff" containerName="keystone-db-sync" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.547582 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f822c2c-795c-4729-971d-b63859faff56" containerName="dnsmasq-dns" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.548346 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-swxgw" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.552952 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.553347 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-l2krf" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.553590 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.554072 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.580852 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-swxgw"] Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.641032 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-8js2k"] Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.646469 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-8js2k" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.682112 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-scripts\") pod \"keystone-bootstrap-swxgw\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " pod="openstack/keystone-bootstrap-swxgw" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.682168 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-credential-keys\") pod \"keystone-bootstrap-swxgw\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " pod="openstack/keystone-bootstrap-swxgw" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.682203 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-config-data\") pod \"keystone-bootstrap-swxgw\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " pod="openstack/keystone-bootstrap-swxgw" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.682248 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-combined-ca-bundle\") pod \"keystone-bootstrap-swxgw\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " pod="openstack/keystone-bootstrap-swxgw" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.682290 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-fernet-keys\") pod \"keystone-bootstrap-swxgw\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " pod="openstack/keystone-bootstrap-swxgw" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.682325 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwsms\" (UniqueName: \"kubernetes.io/projected/c7676518-1b64-43b5-85ab-44217dbdfa68-kube-api-access-xwsms\") pod \"keystone-bootstrap-swxgw\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " pod="openstack/keystone-bootstrap-swxgw" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.718895 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-8js2k"] Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.783762 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-8js2k\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " pod="openstack/dnsmasq-dns-847c4cc679-8js2k" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.784035 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-config-data\") pod \"keystone-bootstrap-swxgw\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " pod="openstack/keystone-bootstrap-swxgw" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.784249 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-combined-ca-bundle\") pod \"keystone-bootstrap-swxgw\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " pod="openstack/keystone-bootstrap-swxgw" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.784349 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-config\") pod \"dnsmasq-dns-847c4cc679-8js2k\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " pod="openstack/dnsmasq-dns-847c4cc679-8js2k" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.784443 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-8js2k\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " pod="openstack/dnsmasq-dns-847c4cc679-8js2k" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.784536 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-fernet-keys\") pod \"keystone-bootstrap-swxgw\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " pod="openstack/keystone-bootstrap-swxgw" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.784655 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwsms\" (UniqueName: \"kubernetes.io/projected/c7676518-1b64-43b5-85ab-44217dbdfa68-kube-api-access-xwsms\") pod \"keystone-bootstrap-swxgw\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " pod="openstack/keystone-bootstrap-swxgw" Oct 03 
14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.784779 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-8js2k\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " pod="openstack/dnsmasq-dns-847c4cc679-8js2k" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.784863 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmns7\" (UniqueName: \"kubernetes.io/projected/09e8b285-eaae-4481-ac48-552c47aef7ab-kube-api-access-rmns7\") pod \"dnsmasq-dns-847c4cc679-8js2k\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " pod="openstack/dnsmasq-dns-847c4cc679-8js2k" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.784952 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-scripts\") pod \"keystone-bootstrap-swxgw\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " pod="openstack/keystone-bootstrap-swxgw" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.785028 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-dns-svc\") pod \"dnsmasq-dns-847c4cc679-8js2k\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " pod="openstack/dnsmasq-dns-847c4cc679-8js2k" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.785139 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-credential-keys\") pod \"keystone-bootstrap-swxgw\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " pod="openstack/keystone-bootstrap-swxgw" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.794613 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-credential-keys\") pod \"keystone-bootstrap-swxgw\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " pod="openstack/keystone-bootstrap-swxgw" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.794863 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-config-data\") pod \"keystone-bootstrap-swxgw\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " pod="openstack/keystone-bootstrap-swxgw" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.819527 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-combined-ca-bundle\") pod \"keystone-bootstrap-swxgw\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " pod="openstack/keystone-bootstrap-swxgw" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.819878 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-fernet-keys\") pod \"keystone-bootstrap-swxgw\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " pod="openstack/keystone-bootstrap-swxgw" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.841885 4636 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xwsms\" (UniqueName: \"kubernetes.io/projected/c7676518-1b64-43b5-85ab-44217dbdfa68-kube-api-access-xwsms\") pod \"keystone-bootstrap-swxgw\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " pod="openstack/keystone-bootstrap-swxgw" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.845005 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-scripts\") pod \"keystone-bootstrap-swxgw\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " pod="openstack/keystone-bootstrap-swxgw" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.897422 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-8js2k\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " pod="openstack/dnsmasq-dns-847c4cc679-8js2k" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.897521 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-config\") pod \"dnsmasq-dns-847c4cc679-8js2k\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " pod="openstack/dnsmasq-dns-847c4cc679-8js2k" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.897544 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-8js2k\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " pod="openstack/dnsmasq-dns-847c4cc679-8js2k" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.897597 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-8js2k\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " pod="openstack/dnsmasq-dns-847c4cc679-8js2k" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.897615 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmns7\" (UniqueName: \"kubernetes.io/projected/09e8b285-eaae-4481-ac48-552c47aef7ab-kube-api-access-rmns7\") pod \"dnsmasq-dns-847c4cc679-8js2k\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " pod="openstack/dnsmasq-dns-847c4cc679-8js2k" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.897641 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-dns-svc\") pod \"dnsmasq-dns-847c4cc679-8js2k\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " pod="openstack/dnsmasq-dns-847c4cc679-8js2k" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.898838 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-dns-svc\") pod \"dnsmasq-dns-847c4cc679-8js2k\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " pod="openstack/dnsmasq-dns-847c4cc679-8js2k" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.899516 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-8js2k\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " pod="openstack/dnsmasq-dns-847c4cc679-8js2k" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.900052 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-config\") pod \"dnsmasq-dns-847c4cc679-8js2k\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " pod="openstack/dnsmasq-dns-847c4cc679-8js2k" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.900643 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-8js2k\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " pod="openstack/dnsmasq-dns-847c4cc679-8js2k" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.900699 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-8js2k\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " pod="openstack/dnsmasq-dns-847c4cc679-8js2k" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.908807 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-swxgw" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.963596 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-97znq"] Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.968879 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-97znq" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.988129 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bbhhb" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.993759 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmns7\" (UniqueName: \"kubernetes.io/projected/09e8b285-eaae-4481-ac48-552c47aef7ab-kube-api-access-rmns7\") pod \"dnsmasq-dns-847c4cc679-8js2k\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " pod="openstack/dnsmasq-dns-847c4cc679-8js2k" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.995749 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 03 14:18:56 crc kubenswrapper[4636]: I1003 14:18:56.995970 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.000666 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-8js2k" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.007170 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6cc7cf4789-zfvq9"] Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.008631 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6cc7cf4789-zfvq9" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.022450 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.022698 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.022991 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.023084 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-7xm5b" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.023193 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-zgg4h"] Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.024547 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-zgg4h" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.050443 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.050622 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-kxvg7" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.050732 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.058325 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-97znq"] Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.085863 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6cc7cf4789-zfvq9"] Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.101583 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxgvs\" (UniqueName: \"kubernetes.io/projected/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-kube-api-access-qxgvs\") pod \"horizon-6cc7cf4789-zfvq9\" (UID: \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\") " pod="openstack/horizon-6cc7cf4789-zfvq9" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.101641 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d2a38ef-2fad-4a66-a131-2f690ceb72f1-config\") pod \"neutron-db-sync-97znq\" (UID: \"7d2a38ef-2fad-4a66-a131-2f690ceb72f1\") " pod="openstack/neutron-db-sync-97znq" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.101738 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgnbs\" (UniqueName: \"kubernetes.io/projected/7d2a38ef-2fad-4a66-a131-2f690ceb72f1-kube-api-access-hgnbs\") pod \"neutron-db-sync-97znq\" (UID: \"7d2a38ef-2fad-4a66-a131-2f690ceb72f1\") " pod="openstack/neutron-db-sync-97znq" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.101866 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-config-data\") pod \"horizon-6cc7cf4789-zfvq9\" (UID: \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\") " pod="openstack/horizon-6cc7cf4789-zfvq9" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.101890 4636 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-logs\") pod \"horizon-6cc7cf4789-zfvq9\" (UID: \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\") " pod="openstack/horizon-6cc7cf4789-zfvq9" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.102134 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2a38ef-2fad-4a66-a131-2f690ceb72f1-combined-ca-bundle\") pod \"neutron-db-sync-97znq\" (UID: \"7d2a38ef-2fad-4a66-a131-2f690ceb72f1\") " pod="openstack/neutron-db-sync-97znq" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.102159 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-horizon-secret-key\") pod \"horizon-6cc7cf4789-zfvq9\" (UID: \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\") " pod="openstack/horizon-6cc7cf4789-zfvq9" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.102276 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-scripts\") pod \"horizon-6cc7cf4789-zfvq9\" (UID: \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\") " pod="openstack/horizon-6cc7cf4789-zfvq9" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.141537 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-zgg4h"] Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.204466 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-scripts\") pod \"cinder-db-sync-zgg4h\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " pod="openstack/cinder-db-sync-zgg4h" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.204563 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-db-sync-config-data\") pod \"cinder-db-sync-zgg4h\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " pod="openstack/cinder-db-sync-zgg4h" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.204618 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2a38ef-2fad-4a66-a131-2f690ceb72f1-combined-ca-bundle\") pod \"neutron-db-sync-97znq\" (UID: \"7d2a38ef-2fad-4a66-a131-2f690ceb72f1\") " pod="openstack/neutron-db-sync-97znq" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.204655 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-horizon-secret-key\") pod \"horizon-6cc7cf4789-zfvq9\" (UID: \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\") " pod="openstack/horizon-6cc7cf4789-zfvq9" Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.204703 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-scripts\") pod \"horizon-6cc7cf4789-zfvq9\" (UID: \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\") " pod="openstack/horizon-6cc7cf4789-zfvq9" Oct 03 14:18:57 
crc kubenswrapper[4636]: I1003 14:18:57.204732 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-combined-ca-bundle\") pod \"cinder-db-sync-zgg4h\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " pod="openstack/cinder-db-sync-zgg4h"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.204790 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96vlm\" (UniqueName: \"kubernetes.io/projected/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-kube-api-access-96vlm\") pod \"cinder-db-sync-zgg4h\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " pod="openstack/cinder-db-sync-zgg4h"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.204820 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-config-data\") pod \"cinder-db-sync-zgg4h\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " pod="openstack/cinder-db-sync-zgg4h"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.204860 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-etc-machine-id\") pod \"cinder-db-sync-zgg4h\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " pod="openstack/cinder-db-sync-zgg4h"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.204924 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxgvs\" (UniqueName: \"kubernetes.io/projected/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-kube-api-access-qxgvs\") pod \"horizon-6cc7cf4789-zfvq9\" (UID: \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\") " pod="openstack/horizon-6cc7cf4789-zfvq9"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.204957 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d2a38ef-2fad-4a66-a131-2f690ceb72f1-config\") pod \"neutron-db-sync-97znq\" (UID: \"7d2a38ef-2fad-4a66-a131-2f690ceb72f1\") " pod="openstack/neutron-db-sync-97znq"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.205035 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgnbs\" (UniqueName: \"kubernetes.io/projected/7d2a38ef-2fad-4a66-a131-2f690ceb72f1-kube-api-access-hgnbs\") pod \"neutron-db-sync-97znq\" (UID: \"7d2a38ef-2fad-4a66-a131-2f690ceb72f1\") " pod="openstack/neutron-db-sync-97znq"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.205082 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-config-data\") pod \"horizon-6cc7cf4789-zfvq9\" (UID: \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\") " pod="openstack/horizon-6cc7cf4789-zfvq9"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.205132 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-logs\") pod \"horizon-6cc7cf4789-zfvq9\" (UID: \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\") " pod="openstack/horizon-6cc7cf4789-zfvq9"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.205741 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-logs\") pod \"horizon-6cc7cf4789-zfvq9\" (UID: \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\") " pod="openstack/horizon-6cc7cf4789-zfvq9"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.213978 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-scripts\") pod \"horizon-6cc7cf4789-zfvq9\" (UID: \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\") " pod="openstack/horizon-6cc7cf4789-zfvq9"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.223469 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-config-data\") pod \"horizon-6cc7cf4789-zfvq9\" (UID: \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\") " pod="openstack/horizon-6cc7cf4789-zfvq9"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.228530 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-5dc9p"]
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.230162 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-5dc9p"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.234739 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2a38ef-2fad-4a66-a131-2f690ceb72f1-combined-ca-bundle\") pod \"neutron-db-sync-97znq\" (UID: \"7d2a38ef-2fad-4a66-a131-2f690ceb72f1\") " pod="openstack/neutron-db-sync-97znq"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.235368 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-horizon-secret-key\") pod \"horizon-6cc7cf4789-zfvq9\" (UID: \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\") " pod="openstack/horizon-6cc7cf4789-zfvq9"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.243800 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qjlnd"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.249854 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.249886 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d2a38ef-2fad-4a66-a131-2f690ceb72f1-config\") pod \"neutron-db-sync-97znq\" (UID: \"7d2a38ef-2fad-4a66-a131-2f690ceb72f1\") " pod="openstack/neutron-db-sync-97znq"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.249908 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-5dc9p"]
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.274637 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxgvs\" (UniqueName: \"kubernetes.io/projected/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-kube-api-access-qxgvs\") pod \"horizon-6cc7cf4789-zfvq9\" (UID: \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\") " pod="openstack/horizon-6cc7cf4789-zfvq9"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.292264 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgnbs\" (UniqueName: \"kubernetes.io/projected/7d2a38ef-2fad-4a66-a131-2f690ceb72f1-kube-api-access-hgnbs\") pod \"neutron-db-sync-97znq\" (UID: \"7d2a38ef-2fad-4a66-a131-2f690ceb72f1\") " pod="openstack/neutron-db-sync-97znq"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.335532 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-scripts\") pod \"cinder-db-sync-zgg4h\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " pod="openstack/cinder-db-sync-zgg4h"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.335880 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-db-sync-config-data\") pod \"cinder-db-sync-zgg4h\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " pod="openstack/cinder-db-sync-zgg4h"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.335982 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-combined-ca-bundle\") pod \"cinder-db-sync-zgg4h\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " pod="openstack/cinder-db-sync-zgg4h"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.336022 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96vlm\" (UniqueName: \"kubernetes.io/projected/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-kube-api-access-96vlm\") pod \"cinder-db-sync-zgg4h\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " pod="openstack/cinder-db-sync-zgg4h"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.336064 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-config-data\") pod \"cinder-db-sync-zgg4h\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " pod="openstack/cinder-db-sync-zgg4h"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.336085 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-etc-machine-id\") pod \"cinder-db-sync-zgg4h\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " pod="openstack/cinder-db-sync-zgg4h"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.336206 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-etc-machine-id\") pod \"cinder-db-sync-zgg4h\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " pod="openstack/cinder-db-sync-zgg4h"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.351860 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-scripts\") pod \"cinder-db-sync-zgg4h\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " pod="openstack/cinder-db-sync-zgg4h"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.382813 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-db-sync-config-data\") pod \"cinder-db-sync-zgg4h\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " pod="openstack/cinder-db-sync-zgg4h"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.385374 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-config-data\") pod \"cinder-db-sync-zgg4h\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " pod="openstack/cinder-db-sync-zgg4h"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.408687 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-combined-ca-bundle\") pod \"cinder-db-sync-zgg4h\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " pod="openstack/cinder-db-sync-zgg4h"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.409952 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-8js2k"]
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.411512 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-97znq"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.425852 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96vlm\" (UniqueName: \"kubernetes.io/projected/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-kube-api-access-96vlm\") pod \"cinder-db-sync-zgg4h\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " pod="openstack/cinder-db-sync-zgg4h"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.438538 4636 generic.go:334] "Generic (PLEG): container finished" podID="69fefc36-2141-4a6e-b716-ae8e2e330bb8" containerID="7d082b8612ef2bda639bc4d2614bddf2d3100f3b657ecc771b1b7ee3d0d2ec42" exitCode=0
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.438789 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" event={"ID":"69fefc36-2141-4a6e-b716-ae8e2e330bb8","Type":"ContainerDied","Data":"7d082b8612ef2bda639bc4d2614bddf2d3100f3b657ecc771b1b7ee3d0d2ec42"}
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.439229 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6cc7cf4789-zfvq9"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.440408 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eb04b62-9d0b-4dda-aff1-022bed4af5b4-combined-ca-bundle\") pod \"barbican-db-sync-5dc9p\" (UID: \"0eb04b62-9d0b-4dda-aff1-022bed4af5b4\") " pod="openstack/barbican-db-sync-5dc9p"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.440461 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0eb04b62-9d0b-4dda-aff1-022bed4af5b4-db-sync-config-data\") pod \"barbican-db-sync-5dc9p\" (UID: \"0eb04b62-9d0b-4dda-aff1-022bed4af5b4\") " pod="openstack/barbican-db-sync-5dc9p"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.440522 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h94q\" (UniqueName: \"kubernetes.io/projected/0eb04b62-9d0b-4dda-aff1-022bed4af5b4-kube-api-access-2h94q\") pod \"barbican-db-sync-5dc9p\" (UID: \"0eb04b62-9d0b-4dda-aff1-022bed4af5b4\") " pod="openstack/barbican-db-sync-5dc9p"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.461321 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-zgg4h"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.461732 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.464051 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.489504 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.489695 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.548977 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.549023 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-config-data\") pod \"ceilometer-0\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.549062 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-log-httpd\") pod \"ceilometer-0\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.549087 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-run-httpd\") pod \"ceilometer-0\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.549118 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eb04b62-9d0b-4dda-aff1-022bed4af5b4-combined-ca-bundle\") pod \"barbican-db-sync-5dc9p\" (UID: \"0eb04b62-9d0b-4dda-aff1-022bed4af5b4\") " pod="openstack/barbican-db-sync-5dc9p"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.549137 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0eb04b62-9d0b-4dda-aff1-022bed4af5b4-db-sync-config-data\") pod \"barbican-db-sync-5dc9p\" (UID: \"0eb04b62-9d0b-4dda-aff1-022bed4af5b4\") " pod="openstack/barbican-db-sync-5dc9p"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.549159 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5pgm\" (UniqueName: \"kubernetes.io/projected/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-kube-api-access-l5pgm\") pod \"ceilometer-0\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.572823 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eb04b62-9d0b-4dda-aff1-022bed4af5b4-combined-ca-bundle\") pod \"barbican-db-sync-5dc9p\" (UID: \"0eb04b62-9d0b-4dda-aff1-022bed4af5b4\") " pod="openstack/barbican-db-sync-5dc9p"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.573141 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h94q\" (UniqueName: \"kubernetes.io/projected/0eb04b62-9d0b-4dda-aff1-022bed4af5b4-kube-api-access-2h94q\") pod \"barbican-db-sync-5dc9p\" (UID: \"0eb04b62-9d0b-4dda-aff1-022bed4af5b4\") " pod="openstack/barbican-db-sync-5dc9p"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.573180 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-scripts\") pod \"ceilometer-0\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.573244 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.573867 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0eb04b62-9d0b-4dda-aff1-022bed4af5b4-db-sync-config-data\") pod \"barbican-db-sync-5dc9p\" (UID: \"0eb04b62-9d0b-4dda-aff1-022bed4af5b4\") " pod="openstack/barbican-db-sync-5dc9p"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.574914 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-plqdl"]
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.576146 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.671258 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-plqdl"]
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.675161 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-plqdl\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.675226 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-plqdl\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.675254 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-log-httpd\") pod \"ceilometer-0\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.675304 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-run-httpd\") pod \"ceilometer-0\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.675335 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-plqdl\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.675355 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5pgm\" (UniqueName: \"kubernetes.io/projected/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-kube-api-access-l5pgm\") pod \"ceilometer-0\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.675380 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-plqdl\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.675408 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29hxx\" (UniqueName: \"kubernetes.io/projected/68364772-0202-4eb2-9789-c71d42592c48-kube-api-access-29hxx\") pod \"dnsmasq-dns-785d8bcb8c-plqdl\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.675445 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-config\") pod \"dnsmasq-dns-785d8bcb8c-plqdl\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.675476 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-scripts\") pod \"ceilometer-0\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.675504 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.675530 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.675765 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-config-data\") pod \"ceilometer-0\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.678752 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-log-httpd\") pod \"ceilometer-0\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.679596 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-run-httpd\") pod \"ceilometer-0\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.680476 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-config-data\") pod \"ceilometer-0\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.692408 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h94q\" (UniqueName: \"kubernetes.io/projected/0eb04b62-9d0b-4dda-aff1-022bed4af5b4-kube-api-access-2h94q\") pod \"barbican-db-sync-5dc9p\" (UID: \"0eb04b62-9d0b-4dda-aff1-022bed4af5b4\") " pod="openstack/barbican-db-sync-5dc9p"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.701601 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.718849 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.722634 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-scripts\") pod \"ceilometer-0\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.744994 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5pgm\" (UniqueName: \"kubernetes.io/projected/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-kube-api-access-l5pgm\") pod \"ceilometer-0\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.782186 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29hxx\" (UniqueName: \"kubernetes.io/projected/68364772-0202-4eb2-9789-c71d42592c48-kube-api-access-29hxx\") pod \"dnsmasq-dns-785d8bcb8c-plqdl\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.782243 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-config\") pod \"dnsmasq-dns-785d8bcb8c-plqdl\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.782336 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-plqdl\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.782360 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-plqdl\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.782402 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-plqdl\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.782425 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-plqdl\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.783251 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-plqdl\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.783781 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-plqdl\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.784287 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-plqdl\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.785461 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-config\") pod \"dnsmasq-dns-785d8bcb8c-plqdl\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.789012 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-plqdl\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.822450 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-hv2wz"]
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.823476 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hv2wz"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.846544 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.846751 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.846860 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-lsrxj"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.852542 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.877943 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-5dc9p"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.883979 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e452cc-6659-4abd-88ff-d9e731b9b1ef-combined-ca-bundle\") pod \"placement-db-sync-hv2wz\" (UID: \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\") " pod="openstack/placement-db-sync-hv2wz"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.884164 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e452cc-6659-4abd-88ff-d9e731b9b1ef-config-data\") pod \"placement-db-sync-hv2wz\" (UID: \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\") " pod="openstack/placement-db-sync-hv2wz"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.884205 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfcb5\" (UniqueName: \"kubernetes.io/projected/02e452cc-6659-4abd-88ff-d9e731b9b1ef-kube-api-access-dfcb5\") pod \"placement-db-sync-hv2wz\" (UID: \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\") " pod="openstack/placement-db-sync-hv2wz"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.884226 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02e452cc-6659-4abd-88ff-d9e731b9b1ef-scripts\") pod \"placement-db-sync-hv2wz\" (UID: \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\") " pod="openstack/placement-db-sync-hv2wz"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.884265 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02e452cc-6659-4abd-88ff-d9e731b9b1ef-logs\") pod \"placement-db-sync-hv2wz\" (UID: \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\") " pod="openstack/placement-db-sync-hv2wz"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.926087 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29hxx\" (UniqueName: \"kubernetes.io/projected/68364772-0202-4eb2-9789-c71d42592c48-kube-api-access-29hxx\") pod \"dnsmasq-dns-785d8bcb8c-plqdl\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.945868 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-64994b5457-95phr"]
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.955786 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64994b5457-95phr"
Oct 03 14:18:57 crc kubenswrapper[4636]: I1003 14:18:57.971648 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:57.988645 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-hv2wz"]
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.014978 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e452cc-6659-4abd-88ff-d9e731b9b1ef-config-data\") pod \"placement-db-sync-hv2wz\" (UID: \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\") " pod="openstack/placement-db-sync-hv2wz"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.015129 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfcb5\" (UniqueName: \"kubernetes.io/projected/02e452cc-6659-4abd-88ff-d9e731b9b1ef-kube-api-access-dfcb5\") pod \"placement-db-sync-hv2wz\" (UID: \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\") " pod="openstack/placement-db-sync-hv2wz"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.015163 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02e452cc-6659-4abd-88ff-d9e731b9b1ef-scripts\") pod \"placement-db-sync-hv2wz\" (UID: \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\") " pod="openstack/placement-db-sync-hv2wz"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.015243 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02e452cc-6659-4abd-88ff-d9e731b9b1ef-logs\") pod \"placement-db-sync-hv2wz\" (UID: \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\") " pod="openstack/placement-db-sync-hv2wz"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.015356 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e452cc-6659-4abd-88ff-d9e731b9b1ef-combined-ca-bundle\") pod \"placement-db-sync-hv2wz\" (UID: \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\") " pod="openstack/placement-db-sync-hv2wz"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.015401 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a38ad103-c6f8-4570-80f0-c9e7d3577588-scripts\") pod \"horizon-64994b5457-95phr\" (UID: \"a38ad103-c6f8-4570-80f0-c9e7d3577588\") " pod="openstack/horizon-64994b5457-95phr"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.015688 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a38ad103-c6f8-4570-80f0-c9e7d3577588-logs\") pod \"horizon-64994b5457-95phr\" (UID: \"a38ad103-c6f8-4570-80f0-c9e7d3577588\") " pod="openstack/horizon-64994b5457-95phr"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.015728 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a38ad103-c6f8-4570-80f0-c9e7d3577588-horizon-secret-key\") pod \"horizon-64994b5457-95phr\" (UID: \"a38ad103-c6f8-4570-80f0-c9e7d3577588\") " pod="openstack/horizon-64994b5457-95phr"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.015796 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6phx\" (UniqueName: \"kubernetes.io/projected/a38ad103-c6f8-4570-80f0-c9e7d3577588-kube-api-access-t6phx\") pod \"horizon-64994b5457-95phr\" (UID: \"a38ad103-c6f8-4570-80f0-c9e7d3577588\") " pod="openstack/horizon-64994b5457-95phr"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.015817 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a38ad103-c6f8-4570-80f0-c9e7d3577588-config-data\") pod \"horizon-64994b5457-95phr\" (UID: \"a38ad103-c6f8-4570-80f0-c9e7d3577588\") " pod="openstack/horizon-64994b5457-95phr"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.017144 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02e452cc-6659-4abd-88ff-d9e731b9b1ef-logs\") pod \"placement-db-sync-hv2wz\" (UID: \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\") " pod="openstack/placement-db-sync-hv2wz"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.022549 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e452cc-6659-4abd-88ff-d9e731b9b1ef-combined-ca-bundle\") pod \"placement-db-sync-hv2wz\" (UID: \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\") " pod="openstack/placement-db-sync-hv2wz"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.025933 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e452cc-6659-4abd-88ff-d9e731b9b1ef-config-data\") pod \"placement-db-sync-hv2wz\" (UID: \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\") " pod="openstack/placement-db-sync-hv2wz"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.031781 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02e452cc-6659-4abd-88ff-d9e731b9b1ef-scripts\") pod \"placement-db-sync-hv2wz\" (UID: \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\") " pod="openstack/placement-db-sync-hv2wz"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.045508 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64994b5457-95phr"]
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.111270 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.119534 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a38ad103-c6f8-4570-80f0-c9e7d3577588-scripts\") pod \"horizon-64994b5457-95phr\" (UID: \"a38ad103-c6f8-4570-80f0-c9e7d3577588\") " pod="openstack/horizon-64994b5457-95phr"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.119596 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a38ad103-c6f8-4570-80f0-c9e7d3577588-logs\") pod \"horizon-64994b5457-95phr\" (UID: \"a38ad103-c6f8-4570-80f0-c9e7d3577588\") " pod="openstack/horizon-64994b5457-95phr"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.119629 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a38ad103-c6f8-4570-80f0-c9e7d3577588-horizon-secret-key\") pod \"horizon-64994b5457-95phr\" (UID: \"a38ad103-c6f8-4570-80f0-c9e7d3577588\") " pod="openstack/horizon-64994b5457-95phr"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.119683 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6phx\" (UniqueName: \"kubernetes.io/projected/a38ad103-c6f8-4570-80f0-c9e7d3577588-kube-api-access-t6phx\") pod \"horizon-64994b5457-95phr\" (UID: \"a38ad103-c6f8-4570-80f0-c9e7d3577588\") " pod="openstack/horizon-64994b5457-95phr"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.119702 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a38ad103-c6f8-4570-80f0-c9e7d3577588-config-data\") pod \"horizon-64994b5457-95phr\" (UID: \"a38ad103-c6f8-4570-80f0-c9e7d3577588\") " pod="openstack/horizon-64994b5457-95phr"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.135815 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.138140 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a38ad103-c6f8-4570-80f0-c9e7d3577588-config-data\") pod \"horizon-64994b5457-95phr\" (UID: \"a38ad103-c6f8-4570-80f0-c9e7d3577588\") " pod="openstack/horizon-64994b5457-95phr"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.145497 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.147813 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.161478 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a38ad103-c6f8-4570-80f0-c9e7d3577588-logs\") pod \"horizon-64994b5457-95phr\" (UID: \"a38ad103-c6f8-4570-80f0-c9e7d3577588\") " pod="openstack/horizon-64994b5457-95phr"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.169256 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a38ad103-c6f8-4570-80f0-c9e7d3577588-scripts\") pod \"horizon-64994b5457-95phr\" (UID: \"a38ad103-c6f8-4570-80f0-c9e7d3577588\") " pod="openstack/horizon-64994b5457-95phr"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.174543 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a38ad103-c6f8-4570-80f0-c9e7d3577588-horizon-secret-key\") pod \"horizon-64994b5457-95phr\" (UID: \"a38ad103-c6f8-4570-80f0-c9e7d3577588\") " pod="openstack/horizon-64994b5457-95phr"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.199999 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.201253 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.201602 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.201828 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.202043 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ns26q"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.214276 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.214672 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfcb5\" (UniqueName: \"kubernetes.io/projected/02e452cc-6659-4abd-88ff-d9e731b9b1ef-kube-api-access-dfcb5\") pod \"placement-db-sync-hv2wz\" (UID: \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\") " pod="openstack/placement-db-sync-hv2wz"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.218181 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.250418 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.257271 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.322346 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 03 14:18:58 crc kubenswrapper[4636]: E1003 14:18:58.323114 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-5npp2 logs scripts], unattached volumes=[], failed to process volumes=[combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-5npp2 logs scripts]: context canceled" pod="openstack/glance-default-internal-api-0" podUID="c9bd6058-a79d-4837-ba94-f29846dac718"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.375655 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6phx\" (UniqueName: \"kubernetes.io/projected/a38ad103-c6f8-4570-80f0-c9e7d3577588-kube-api-access-t6phx\") pod \"horizon-64994b5457-95phr\" (UID: \"a38ad103-c6f8-4570-80f0-c9e7d3577588\") " pod="openstack/horizon-64994b5457-95phr"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.383244 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.383310 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.383333 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.383363 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.383422 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9bd6058-a79d-4837-ba94-f29846dac718-logs\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.383454 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9bd6058-a79d-4837-ba94-f29846dac718-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.383474 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.383519 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5npp2\" (UniqueName: \"kubernetes.io/projected/c9bd6058-a79d-4837-ba94-f29846dac718-kube-api-access-5npp2\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.487720 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-ovsdbserver-nb\") pod \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") "
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.487803 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-ovsdbserver-sb\") pod \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") "
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.487854 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-dns-svc\") pod \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") "
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.487970 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-config\") pod \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") "
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.488086 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-dns-swift-storage-0\") pod \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") "
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.488168 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rq7r\" (UniqueName: \"kubernetes.io/projected/69fefc36-2141-4a6e-b716-ae8e2e330bb8-kube-api-access-2rq7r\") pod \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") "
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.488425 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.488528 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-scripts\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.488557 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9bd6058-a79d-4837-ba94-f29846dac718-logs\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.488585 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9bd6058-a79d-4837-ba94-f29846dac718-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.488614 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.488667 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pblw2\" (UniqueName: \"kubernetes.io/projected/180ad50d-7586-4906-85d4-463e49a589c7-kube-api-access-pblw2\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.488695 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-config-data\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.488719 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/180ad50d-7586-4906-85d4-463e49a589c7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.488763 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.488788 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5npp2\" (UniqueName: \"kubernetes.io/projected/c9bd6058-a79d-4837-ba94-f29846dac718-kube-api-access-5npp2\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.488809 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/180ad50d-7586-4906-85d4-463e49a589c7-logs\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.488854 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.488906 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.488933 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.488981 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.489009 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.520410 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.520720 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-swxgw"]
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.521234 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64994b5457-95phr"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.522301 4636 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.522498 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9bd6058-a79d-4837-ba94-f29846dac718-logs\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.529348 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9bd6058-a79d-4837-ba94-f29846dac718-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.529955 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hv2wz"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.532951 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.560647 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.561550 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.562294 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.562636 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" event={"ID":"69fefc36-2141-4a6e-b716-ae8e2e330bb8","Type":"ContainerDied","Data":"65338202177fcbb7465eaa3e5f160353d56fbb059cbbe90473127be839834d69"}
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.563648 4636 scope.go:117] "RemoveContainer" containerID="7d082b8612ef2bda639bc4d2614bddf2d3100f3b657ecc771b1b7ee3d0d2ec42"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.593386 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.597325 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69fefc36-2141-4a6e-b716-ae8e2e330bb8-kube-api-access-2rq7r" (OuterVolumeSpecName: "kube-api-access-2rq7r") pod "69fefc36-2141-4a6e-b716-ae8e2e330bb8" (UID: "69fefc36-2141-4a6e-b716-ae8e2e330bb8"). InnerVolumeSpecName "kube-api-access-2rq7r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.599569 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.600046 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rq7r\" (UniqueName: \"kubernetes.io/projected/69fefc36-2141-4a6e-b716-ae8e2e330bb8-kube-api-access-2rq7r\") pod \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\" (UID: \"69fefc36-2141-4a6e-b716-ae8e2e330bb8\") "
Oct 03 14:18:58 crc kubenswrapper[4636]: W1003 14:18:58.600253 4636 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/69fefc36-2141-4a6e-b716-ae8e2e330bb8/volumes/kubernetes.io~projected/kube-api-access-2rq7r
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.600305 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69fefc36-2141-4a6e-b716-ae8e2e330bb8-kube-api-access-2rq7r" (OuterVolumeSpecName: "kube-api-access-2rq7r") pod "69fefc36-2141-4a6e-b716-ae8e2e330bb8" (UID: "69fefc36-2141-4a6e-b716-ae8e2e330bb8"). InnerVolumeSpecName "kube-api-access-2rq7r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.602584 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5npp2\" (UniqueName: \"kubernetes.io/projected/c9bd6058-a79d-4837-ba94-f29846dac718-kube-api-access-5npp2\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.603049 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-scripts\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.603613 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pblw2\" (UniqueName: \"kubernetes.io/projected/180ad50d-7586-4906-85d4-463e49a589c7-kube-api-access-pblw2\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.603650 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-config-data\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.603856 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/180ad50d-7586-4906-85d4-463e49a589c7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.604093 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.604219 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/180ad50d-7586-4906-85d4-463e49a589c7-logs\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.604310 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.604336 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.604631 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rq7r\" (UniqueName: \"kubernetes.io/projected/69fefc36-2141-4a6e-b716-ae8e2e330bb8-kube-api-access-2rq7r\") on node \"crc\" DevicePath \"\""
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.605327 4636 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.607343 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/180ad50d-7586-4906-85d4-463e49a589c7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.607843 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/180ad50d-7586-4906-85d4-463e49a589c7-logs\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.609267 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.622655 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-config-data\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.640725 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-scripts\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.657163 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.657787 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.672631 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pblw2\" (UniqueName: \"kubernetes.io/projected/180ad50d-7586-4906-85d4-463e49a589c7-kube-api-access-pblw2\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 
14:18:58.686616 4636 scope.go:117] "RemoveContainer" containerID="480cac71ecee2e82d7dcabfc7329f93c858f425f9bbe046d72881a5b44ec783c" Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.706857 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-combined-ca-bundle\") pod \"c9bd6058-a79d-4837-ba94-f29846dac718\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.706895 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9bd6058-a79d-4837-ba94-f29846dac718-logs\") pod \"c9bd6058-a79d-4837-ba94-f29846dac718\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.706945 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9bd6058-a79d-4837-ba94-f29846dac718-httpd-run\") pod \"c9bd6058-a79d-4837-ba94-f29846dac718\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.707022 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-scripts\") pod \"c9bd6058-a79d-4837-ba94-f29846dac718\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.707150 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-config-data\") pod \"c9bd6058-a79d-4837-ba94-f29846dac718\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.707166 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-internal-tls-certs\") pod \"c9bd6058-a79d-4837-ba94-f29846dac718\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.707203 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5npp2\" (UniqueName: \"kubernetes.io/projected/c9bd6058-a79d-4837-ba94-f29846dac718-kube-api-access-5npp2\") pod \"c9bd6058-a79d-4837-ba94-f29846dac718\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.707649 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9bd6058-a79d-4837-ba94-f29846dac718-logs" (OuterVolumeSpecName: "logs") pod "c9bd6058-a79d-4837-ba94-f29846dac718" (UID: "c9bd6058-a79d-4837-ba94-f29846dac718"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.707952 4636 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9bd6058-a79d-4837-ba94-f29846dac718-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.718495 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9bd6058-a79d-4837-ba94-f29846dac718-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c9bd6058-a79d-4837-ba94-f29846dac718" (UID: "c9bd6058-a79d-4837-ba94-f29846dac718"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.732575 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9bd6058-a79d-4837-ba94-f29846dac718" (UID: "c9bd6058-a79d-4837-ba94-f29846dac718"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.737945 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-scripts" (OuterVolumeSpecName: "scripts") pod "c9bd6058-a79d-4837-ba94-f29846dac718" (UID: "c9bd6058-a79d-4837-ba94-f29846dac718"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.737955 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9bd6058-a79d-4837-ba94-f29846dac718-kube-api-access-5npp2" (OuterVolumeSpecName: "kube-api-access-5npp2") pod "c9bd6058-a79d-4837-ba94-f29846dac718" (UID: "c9bd6058-a79d-4837-ba94-f29846dac718"). InnerVolumeSpecName "kube-api-access-5npp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.741618 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-config-data" (OuterVolumeSpecName: "config-data") pod "c9bd6058-a79d-4837-ba94-f29846dac718" (UID: "c9bd6058-a79d-4837-ba94-f29846dac718"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.747247 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c9bd6058-a79d-4837-ba94-f29846dac718" (UID: "c9bd6058-a79d-4837-ba94-f29846dac718"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.810856 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.811181 4636 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.811199 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5npp2\" (UniqueName: \"kubernetes.io/projected/c9bd6058-a79d-4837-ba94-f29846dac718-kube-api-access-5npp2\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.811211 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.811228 4636 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9bd6058-a79d-4837-ba94-f29846dac718-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.811240 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9bd6058-a79d-4837-ba94-f29846dac718-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.936609 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " pod="openstack/glance-default-external-api-0" Oct 03 14:18:58 crc kubenswrapper[4636]: I1003 14:18:58.995465 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-config" (OuterVolumeSpecName: "config") pod "69fefc36-2141-4a6e-b716-ae8e2e330bb8" (UID: "69fefc36-2141-4a6e-b716-ae8e2e330bb8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.055265 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.056407 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.063494 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-97znq"] Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.067554 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "69fefc36-2141-4a6e-b716-ae8e2e330bb8" (UID: "69fefc36-2141-4a6e-b716-ae8e2e330bb8"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.093293 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "69fefc36-2141-4a6e-b716-ae8e2e330bb8" (UID: "69fefc36-2141-4a6e-b716-ae8e2e330bb8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.130370 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-8js2k"] Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.156759 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"c9bd6058-a79d-4837-ba94-f29846dac718\" (UID: \"c9bd6058-a79d-4837-ba94-f29846dac718\") " Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.157317 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.157337 4636 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.169069 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "c9bd6058-a79d-4837-ba94-f29846dac718" (UID: "c9bd6058-a79d-4837-ba94-f29846dac718"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.194646 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.235194 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "69fefc36-2141-4a6e-b716-ae8e2e330bb8" (UID: "69fefc36-2141-4a6e-b716-ae8e2e330bb8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.259403 4636 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.259474 4636 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.286837 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "69fefc36-2141-4a6e-b716-ae8e2e330bb8" (UID: "69fefc36-2141-4a6e-b716-ae8e2e330bb8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.350210 4636 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.376188 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.388244 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69fefc36-2141-4a6e-b716-ae8e2e330bb8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.388388 4636 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.486626 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-plqdl"] Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.544614 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-zgg4h"] Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.620303 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-hjczt"] Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.625674 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-97znq" event={"ID":"7d2a38ef-2fad-4a66-a131-2f690ceb72f1","Type":"ContainerStarted","Data":"9e0e7f5e08ba646f03fbb03f2233f2e2ec5fda676b9dcecd70f12618c0182f05"} Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.650363 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-hjczt"] Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.656466 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-swxgw" event={"ID":"c7676518-1b64-43b5-85ab-44217dbdfa68","Type":"ContainerStarted","Data":"7cc5445780c2bc60b13e6c4386f884c0cc76b440675901fee80ec24a5ab19745"} Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.656499 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-swxgw" event={"ID":"c7676518-1b64-43b5-85ab-44217dbdfa68","Type":"ContainerStarted","Data":"d0219683a17a60be8865249bcac2cf04d5e32064127434dde818d67be23ef6cc"} Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.675403 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ba9d4a1-300f-4367-ba2e-528ed4635dfd","Type":"ContainerStarted","Data":"b0705a02839f38e6a5797d69aecce8e13f3afd05a7971c0f43da1d0795b0bc90"} Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.698917 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-8js2k" event={"ID":"09e8b285-eaae-4481-ac48-552c47aef7ab","Type":"ContainerStarted","Data":"5b491feef3069c9f9d92d9a93516e1e2e43273f7bfc2f5a12259a7d4fb17a01a"} Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.720404 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl" event={"ID":"68364772-0202-4eb2-9789-c71d42592c48","Type":"ContainerStarted","Data":"25e794b20be9e591c985f28b577c8801090035fd6a79a98a809b03e566b08e50"} Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.723775 4636 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.728739 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6cc7cf4789-zfvq9"] Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.739519 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-5dc9p"] Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.790626 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-hv2wz"] Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.803540 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-swxgw" podStartSLOduration=3.80351681 podStartE2EDuration="3.80351681s" podCreationTimestamp="2025-10-03 14:18:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:18:59.685237931 +0000 UTC m=+1089.543964178" watchObservedRunningTime="2025-10-03 14:18:59.80351681 +0000 UTC m=+1089.662243057" Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.860927 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64994b5457-95phr"] Oct 03 14:18:59 crc kubenswrapper[4636]: I1003 14:18:59.926056 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.040996 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.079469 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.090203 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:19:00 crc kubenswrapper[4636]: E1003 14:19:00.090690 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69fefc36-2141-4a6e-b716-ae8e2e330bb8" containerName="dnsmasq-dns" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.090712 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="69fefc36-2141-4a6e-b716-ae8e2e330bb8" containerName="dnsmasq-dns" Oct 03 14:19:00 crc kubenswrapper[4636]: E1003 14:19:00.090736 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69fefc36-2141-4a6e-b716-ae8e2e330bb8" containerName="init" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.090743 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="69fefc36-2141-4a6e-b716-ae8e2e330bb8" containerName="init" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.098641 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="69fefc36-2141-4a6e-b716-ae8e2e330bb8" containerName="dnsmasq-dns" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.101075 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.101268 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.109650 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.109833 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.237166 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.237243 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01e51d41-560a-474b-9b2e-09d2c267878c-logs\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.237277 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc5vc\" (UniqueName: \"kubernetes.io/projected/01e51d41-560a-474b-9b2e-09d2c267878c-kube-api-access-cc5vc\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.237341 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.237405 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.237441 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01e51d41-560a-474b-9b2e-09d2c267878c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.237472 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.237527 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.339681 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.339758 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.339785 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01e51d41-560a-474b-9b2e-09d2c267878c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.340049 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.340087 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.340387 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.340455 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01e51d41-560a-474b-9b2e-09d2c267878c-logs\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.340481 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc5vc\" (UniqueName: \"kubernetes.io/projected/01e51d41-560a-474b-9b2e-09d2c267878c-kube-api-access-cc5vc\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.342308 4636 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") device mount path 
\"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.342849 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01e51d41-560a-474b-9b2e-09d2c267878c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.342304 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01e51d41-560a-474b-9b2e-09d2c267878c-logs\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.347475 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.347932 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.349066 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.363490 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.388940 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc5vc\" (UniqueName: \"kubernetes.io/projected/01e51d41-560a-474b-9b2e-09d2c267878c-kube-api-access-cc5vc\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.421228 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.478431 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.784183 4636 generic.go:334] "Generic (PLEG): container finished" podID="68364772-0202-4eb2-9789-c71d42592c48" containerID="73c23f8270bd49ca9b9bb25c9757f812b2da434f1c211827c2b1a7100eed943e" exitCode=0 Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.799725 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl" event={"ID":"68364772-0202-4eb2-9789-c71d42592c48","Type":"ContainerDied","Data":"73c23f8270bd49ca9b9bb25c9757f812b2da434f1c211827c2b1a7100eed943e"} Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.914676 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69fefc36-2141-4a6e-b716-ae8e2e330bb8" path="/var/lib/kubelet/pods/69fefc36-2141-4a6e-b716-ae8e2e330bb8/volumes" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.923359 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9bd6058-a79d-4837-ba94-f29846dac718" path="/var/lib/kubelet/pods/c9bd6058-a79d-4837-ba94-f29846dac718/volumes" Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.941805 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hv2wz" event={"ID":"02e452cc-6659-4abd-88ff-d9e731b9b1ef","Type":"ContainerStarted","Data":"260bb49d2f7974e7ebca0e537a869bd83d389cd27321b60cd5b91aa6ec1bf83c"} Oct 03 14:19:00 crc kubenswrapper[4636]: I1003 14:19:00.977515 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zgg4h" event={"ID":"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49","Type":"ContainerStarted","Data":"73ac79edc3ecc3c312381d76c522466fc40b245e8886c7a1c65811884a02913f"} Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.003508 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"180ad50d-7586-4906-85d4-463e49a589c7","Type":"ContainerStarted","Data":"64906872ac3aa715da638649660f9971c8c7daaff2a3ed5591fff5c90d894d40"} Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.004627 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5dc9p" event={"ID":"0eb04b62-9d0b-4dda-aff1-022bed4af5b4","Type":"ContainerStarted","Data":"75786b3cc242fbdb05787fd9d6e3a668f9fb301f1f00b3436c46aba2dbcd68a2"} Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.007814 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cc7cf4789-zfvq9" event={"ID":"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f","Type":"ContainerStarted","Data":"e14bd899f89327f91a219a7032b64e0cffef887cece58496c2906ccfe7abb3f7"} Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.031114 4636 generic.go:334] "Generic (PLEG): container finished" podID="09e8b285-eaae-4481-ac48-552c47aef7ab" containerID="1f51906631f19c0fe68422309a37712890fbe69ca2411421778d4a01b0d7fad6" exitCode=0 Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.031208 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-8js2k" event={"ID":"09e8b285-eaae-4481-ac48-552c47aef7ab","Type":"ContainerDied","Data":"1f51906631f19c0fe68422309a37712890fbe69ca2411421778d4a01b0d7fad6"} Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.044259 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64994b5457-95phr" 
event={"ID":"a38ad103-c6f8-4570-80f0-c9e7d3577588","Type":"ContainerStarted","Data":"c8219238ae112e4ff7e6da49a2e734d841a2b987526bd0f7baa9bee5069dc02b"} Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.105073 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-97znq" event={"ID":"7d2a38ef-2fad-4a66-a131-2f690ceb72f1","Type":"ContainerStarted","Data":"ea54d616c2574efe0fca015fd7163ec725f13d0b9b15a84b7a1a06c0339a7c22"} Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.402756 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6cc7cf4789-zfvq9"] Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.421235 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.443641 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5c8488ff4f-4mhlb"] Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.445229 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c8488ff4f-4mhlb" Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.524139 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c8488ff4f-4mhlb"] Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.584585 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f783ef7-df5d-40c8-968d-f965ec7c78e6-scripts\") pod \"horizon-5c8488ff4f-4mhlb\" (UID: \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\") " pod="openstack/horizon-5c8488ff4f-4mhlb" Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.584906 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f783ef7-df5d-40c8-968d-f965ec7c78e6-config-data\") pod \"horizon-5c8488ff4f-4mhlb\" (UID: \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\") " pod="openstack/horizon-5c8488ff4f-4mhlb" Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.584987 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f783ef7-df5d-40c8-968d-f965ec7c78e6-logs\") pod \"horizon-5c8488ff4f-4mhlb\" (UID: \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\") " pod="openstack/horizon-5c8488ff4f-4mhlb" Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.585058 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwd7z\" (UniqueName: \"kubernetes.io/projected/4f783ef7-df5d-40c8-968d-f965ec7c78e6-kube-api-access-mwd7z\") pod \"horizon-5c8488ff4f-4mhlb\" (UID: \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\") " pod="openstack/horizon-5c8488ff4f-4mhlb" Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.585143 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4f783ef7-df5d-40c8-968d-f965ec7c78e6-horizon-secret-key\") pod \"horizon-5c8488ff4f-4mhlb\" (UID: \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\") " pod="openstack/horizon-5c8488ff4f-4mhlb" Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.651897 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.689677 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/4f783ef7-df5d-40c8-968d-f965ec7c78e6-config-data\") pod \"horizon-5c8488ff4f-4mhlb\" (UID: \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\") " pod="openstack/horizon-5c8488ff4f-4mhlb" Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.689798 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f783ef7-df5d-40c8-968d-f965ec7c78e6-logs\") pod \"horizon-5c8488ff4f-4mhlb\" (UID: \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\") " pod="openstack/horizon-5c8488ff4f-4mhlb" Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.689854 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwd7z\" (UniqueName: \"kubernetes.io/projected/4f783ef7-df5d-40c8-968d-f965ec7c78e6-kube-api-access-mwd7z\") pod \"horizon-5c8488ff4f-4mhlb\" (UID: \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\") " pod="openstack/horizon-5c8488ff4f-4mhlb" Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.689921 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4f783ef7-df5d-40c8-968d-f965ec7c78e6-horizon-secret-key\") pod \"horizon-5c8488ff4f-4mhlb\" (UID: \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\") " pod="openstack/horizon-5c8488ff4f-4mhlb" Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.689968 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f783ef7-df5d-40c8-968d-f965ec7c78e6-scripts\") pod \"horizon-5c8488ff4f-4mhlb\" (UID: \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\") " pod="openstack/horizon-5c8488ff4f-4mhlb" Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.690856 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f783ef7-df5d-40c8-968d-f965ec7c78e6-scripts\") pod \"horizon-5c8488ff4f-4mhlb\" (UID: \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\") " pod="openstack/horizon-5c8488ff4f-4mhlb" Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.692008 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f783ef7-df5d-40c8-968d-f965ec7c78e6-config-data\") pod \"horizon-5c8488ff4f-4mhlb\" (UID: \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\") " pod="openstack/horizon-5c8488ff4f-4mhlb" Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.692882 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f783ef7-df5d-40c8-968d-f965ec7c78e6-logs\") pod \"horizon-5c8488ff4f-4mhlb\" (UID: \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\") " pod="openstack/horizon-5c8488ff4f-4mhlb" Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.722846 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4f783ef7-df5d-40c8-968d-f965ec7c78e6-horizon-secret-key\") pod \"horizon-5c8488ff4f-4mhlb\" (UID: \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\") " pod="openstack/horizon-5c8488ff4f-4mhlb" Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.765452 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwd7z\" (UniqueName: \"kubernetes.io/projected/4f783ef7-df5d-40c8-968d-f965ec7c78e6-kube-api-access-mwd7z\") pod \"horizon-5c8488ff4f-4mhlb\" (UID: \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\") " 
pod="openstack/horizon-5c8488ff4f-4mhlb" Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.779962 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.798623 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c8488ff4f-4mhlb" Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.892195 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:19:01 crc kubenswrapper[4636]: I1003 14:19:01.918449 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-97znq" podStartSLOduration=5.918433698 podStartE2EDuration="5.918433698s" podCreationTimestamp="2025-10-03 14:18:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:19:01.917866603 +0000 UTC m=+1091.776592850" watchObservedRunningTime="2025-10-03 14:19:01.918433698 +0000 UTC m=+1091.777159945" Oct 03 14:19:02 crc kubenswrapper[4636]: W1003 14:19:02.071450 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01e51d41_560a_474b_9b2e_09d2c267878c.slice/crio-a27bb89cd0df39289e8d5164bca85610b34998b1e8ac07bcdd65616af630c2d5 WatchSource:0}: Error finding container a27bb89cd0df39289e8d5164bca85610b34998b1e8ac07bcdd65616af630c2d5: Status 404 returned error can't find the container with id a27bb89cd0df39289e8d5164bca85610b34998b1e8ac07bcdd65616af630c2d5 Oct 03 14:19:02 crc kubenswrapper[4636]: I1003 14:19:02.170214 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"01e51d41-560a-474b-9b2e-09d2c267878c","Type":"ContainerStarted","Data":"a27bb89cd0df39289e8d5164bca85610b34998b1e8ac07bcdd65616af630c2d5"} Oct 03 14:19:02 crc kubenswrapper[4636]: I1003 14:19:02.292468 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-8js2k" Oct 03 14:19:02 crc kubenswrapper[4636]: I1003 14:19:02.453772 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmns7\" (UniqueName: \"kubernetes.io/projected/09e8b285-eaae-4481-ac48-552c47aef7ab-kube-api-access-rmns7\") pod \"09e8b285-eaae-4481-ac48-552c47aef7ab\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " Oct 03 14:19:02 crc kubenswrapper[4636]: I1003 14:19:02.454137 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-ovsdbserver-nb\") pod \"09e8b285-eaae-4481-ac48-552c47aef7ab\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " Oct 03 14:19:02 crc kubenswrapper[4636]: I1003 14:19:02.454292 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-config\") pod \"09e8b285-eaae-4481-ac48-552c47aef7ab\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " Oct 03 14:19:02 crc kubenswrapper[4636]: I1003 14:19:02.454326 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-ovsdbserver-sb\") pod \"09e8b285-eaae-4481-ac48-552c47aef7ab\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " Oct 03 14:19:02 crc kubenswrapper[4636]: I1003 14:19:02.454549 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-dns-svc\") pod \"09e8b285-eaae-4481-ac48-552c47aef7ab\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " Oct 03 14:19:02 crc kubenswrapper[4636]: I1003 14:19:02.454597 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-dns-swift-storage-0\") pod \"09e8b285-eaae-4481-ac48-552c47aef7ab\" (UID: \"09e8b285-eaae-4481-ac48-552c47aef7ab\") " Oct 03 14:19:02 crc kubenswrapper[4636]: I1003 14:19:02.482871 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09e8b285-eaae-4481-ac48-552c47aef7ab-kube-api-access-rmns7" (OuterVolumeSpecName: "kube-api-access-rmns7") pod "09e8b285-eaae-4481-ac48-552c47aef7ab" (UID: "09e8b285-eaae-4481-ac48-552c47aef7ab"). InnerVolumeSpecName "kube-api-access-rmns7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:19:02 crc kubenswrapper[4636]: I1003 14:19:02.500758 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "09e8b285-eaae-4481-ac48-552c47aef7ab" (UID: "09e8b285-eaae-4481-ac48-552c47aef7ab"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:19:02 crc kubenswrapper[4636]: I1003 14:19:02.516491 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "09e8b285-eaae-4481-ac48-552c47aef7ab" (UID: "09e8b285-eaae-4481-ac48-552c47aef7ab"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:19:02 crc kubenswrapper[4636]: I1003 14:19:02.537281 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "09e8b285-eaae-4481-ac48-552c47aef7ab" (UID: "09e8b285-eaae-4481-ac48-552c47aef7ab"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:19:02 crc kubenswrapper[4636]: I1003 14:19:02.541608 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-config" (OuterVolumeSpecName: "config") pod "09e8b285-eaae-4481-ac48-552c47aef7ab" (UID: "09e8b285-eaae-4481-ac48-552c47aef7ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:19:02 crc kubenswrapper[4636]: I1003 14:19:02.559989 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "09e8b285-eaae-4481-ac48-552c47aef7ab" (UID: "09e8b285-eaae-4481-ac48-552c47aef7ab"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:19:02 crc kubenswrapper[4636]: I1003 14:19:02.561491 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:02 crc kubenswrapper[4636]: I1003 14:19:02.561514 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:02 crc kubenswrapper[4636]: I1003 14:19:02.561527 4636 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:02 crc kubenswrapper[4636]: I1003 14:19:02.561538 4636 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:02 crc kubenswrapper[4636]: I1003 14:19:02.561549 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmns7\" (UniqueName: \"kubernetes.io/projected/09e8b285-eaae-4481-ac48-552c47aef7ab-kube-api-access-rmns7\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:02 crc kubenswrapper[4636]: I1003 14:19:02.561560 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09e8b285-eaae-4481-ac48-552c47aef7ab-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:02 crc kubenswrapper[4636]: I1003 14:19:02.957422 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c8488ff4f-4mhlb"] Oct 03 14:19:03 crc kubenswrapper[4636]: I1003 14:19:03.011664 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-hjczt" podUID="69fefc36-2141-4a6e-b716-ae8e2e330bb8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.133:5353: i/o timeout" Oct 03 14:19:03 crc kubenswrapper[4636]: I1003 14:19:03.203374 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-5c8488ff4f-4mhlb" event={"ID":"4f783ef7-df5d-40c8-968d-f965ec7c78e6","Type":"ContainerStarted","Data":"5e8e72331e3cad531e1c3c5eaba36d23ead8f4d9551a3bc6ecfd857067dff972"} Oct 03 14:19:03 crc kubenswrapper[4636]: I1003 14:19:03.210446 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"180ad50d-7586-4906-85d4-463e49a589c7","Type":"ContainerStarted","Data":"bd325a24bd3f8724d777f89d5607c93f3464012688e0614f770d512cfdeb7c18"} Oct 03 14:19:03 crc kubenswrapper[4636]: I1003 14:19:03.223433 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-8js2k" event={"ID":"09e8b285-eaae-4481-ac48-552c47aef7ab","Type":"ContainerDied","Data":"5b491feef3069c9f9d92d9a93516e1e2e43273f7bfc2f5a12259a7d4fb17a01a"} Oct 03 14:19:03 crc kubenswrapper[4636]: I1003 14:19:03.223485 4636 scope.go:117] "RemoveContainer" containerID="1f51906631f19c0fe68422309a37712890fbe69ca2411421778d4a01b0d7fad6" Oct 03 14:19:03 crc kubenswrapper[4636]: I1003 14:19:03.223607 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-8js2k" Oct 03 14:19:03 crc kubenswrapper[4636]: I1003 14:19:03.246305 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl" event={"ID":"68364772-0202-4eb2-9789-c71d42592c48","Type":"ContainerStarted","Data":"d495228831ff9b98b3dc64192e48a132b4753e094f0484c2f0acaca216cecf9a"} Oct 03 14:19:03 crc kubenswrapper[4636]: I1003 14:19:03.247396 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl" Oct 03 14:19:03 crc kubenswrapper[4636]: I1003 14:19:03.290560 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-8js2k"] Oct 03 14:19:03 crc kubenswrapper[4636]: I1003 14:19:03.298958 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-8js2k"] Oct 03 14:19:03 crc kubenswrapper[4636]: I1003 14:19:03.340510 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl" podStartSLOduration=6.340489916 podStartE2EDuration="6.340489916s" podCreationTimestamp="2025-10-03 14:18:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:19:03.329858362 +0000 UTC m=+1093.188584609" watchObservedRunningTime="2025-10-03 14:19:03.340489916 +0000 UTC m=+1093.199216163" Oct 03 14:19:04 crc kubenswrapper[4636]: I1003 14:19:04.294036 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"180ad50d-7586-4906-85d4-463e49a589c7","Type":"ContainerStarted","Data":"29105bd946cb053dc362afe570c3a986e3b13c26547f03cb219be90d018a3ada"} Oct 03 14:19:04 crc kubenswrapper[4636]: I1003 14:19:04.294577 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="180ad50d-7586-4906-85d4-463e49a589c7" containerName="glance-log" containerID="cri-o://bd325a24bd3f8724d777f89d5607c93f3464012688e0614f770d512cfdeb7c18" gracePeriod=30 Oct 03 14:19:04 crc kubenswrapper[4636]: I1003 14:19:04.295382 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="180ad50d-7586-4906-85d4-463e49a589c7" containerName="glance-httpd" 
containerID="cri-o://29105bd946cb053dc362afe570c3a986e3b13c26547f03cb219be90d018a3ada" gracePeriod=30 Oct 03 14:19:04 crc kubenswrapper[4636]: I1003 14:19:04.304937 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"01e51d41-560a-474b-9b2e-09d2c267878c","Type":"ContainerStarted","Data":"1ec99c0577d80ea99a219b6f5ef3668258f693c6e76c3d4228b3f06133b372ba"} Oct 03 14:19:04 crc kubenswrapper[4636]: I1003 14:19:04.321620 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.321600137 podStartE2EDuration="7.321600137s" podCreationTimestamp="2025-10-03 14:18:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:19:04.316374862 +0000 UTC m=+1094.175101119" watchObservedRunningTime="2025-10-03 14:19:04.321600137 +0000 UTC m=+1094.180326384" Oct 03 14:19:04 crc kubenswrapper[4636]: I1003 14:19:04.840249 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09e8b285-eaae-4481-ac48-552c47aef7ab" path="/var/lib/kubelet/pods/09e8b285-eaae-4481-ac48-552c47aef7ab/volumes" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.116161 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.245217 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"180ad50d-7586-4906-85d4-463e49a589c7\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.245396 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/180ad50d-7586-4906-85d4-463e49a589c7-logs\") pod \"180ad50d-7586-4906-85d4-463e49a589c7\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.245520 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-scripts\") pod \"180ad50d-7586-4906-85d4-463e49a589c7\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.245552 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-public-tls-certs\") pod \"180ad50d-7586-4906-85d4-463e49a589c7\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.245580 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/180ad50d-7586-4906-85d4-463e49a589c7-httpd-run\") pod \"180ad50d-7586-4906-85d4-463e49a589c7\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.245648 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pblw2\" (UniqueName: \"kubernetes.io/projected/180ad50d-7586-4906-85d4-463e49a589c7-kube-api-access-pblw2\") pod \"180ad50d-7586-4906-85d4-463e49a589c7\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.245700 4636 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-combined-ca-bundle\") pod \"180ad50d-7586-4906-85d4-463e49a589c7\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.245784 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-config-data\") pod \"180ad50d-7586-4906-85d4-463e49a589c7\" (UID: \"180ad50d-7586-4906-85d4-463e49a589c7\") " Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.248153 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/180ad50d-7586-4906-85d4-463e49a589c7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "180ad50d-7586-4906-85d4-463e49a589c7" (UID: "180ad50d-7586-4906-85d4-463e49a589c7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.252977 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/180ad50d-7586-4906-85d4-463e49a589c7-logs" (OuterVolumeSpecName: "logs") pod "180ad50d-7586-4906-85d4-463e49a589c7" (UID: "180ad50d-7586-4906-85d4-463e49a589c7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.271361 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/180ad50d-7586-4906-85d4-463e49a589c7-kube-api-access-pblw2" (OuterVolumeSpecName: "kube-api-access-pblw2") pod "180ad50d-7586-4906-85d4-463e49a589c7" (UID: "180ad50d-7586-4906-85d4-463e49a589c7"). InnerVolumeSpecName "kube-api-access-pblw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.271543 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-scripts" (OuterVolumeSpecName: "scripts") pod "180ad50d-7586-4906-85d4-463e49a589c7" (UID: "180ad50d-7586-4906-85d4-463e49a589c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.287331 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "180ad50d-7586-4906-85d4-463e49a589c7" (UID: "180ad50d-7586-4906-85d4-463e49a589c7"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.330989 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "180ad50d-7586-4906-85d4-463e49a589c7" (UID: "180ad50d-7586-4906-85d4-463e49a589c7"). InnerVolumeSpecName "combined-ca-bundle". 
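
The TearDown entries above show why two names are logged per volume: OuterVolumeSpecName is the name in the pod spec, InnerVolumeSpecName is the name the plugin operates on. For the secrets and empty-dirs the two coincide ("scripts"/"scripts", "logs"/"logs"), but for the PVC-backed volume the pod-spec name "glance" resolves to the persistent volume "local-storage07-crc". The UniqueName strings follow a plugin/volume layout; a sketch of pulling both parts out, not kubelet's own parser:

    package main

    import (
        "fmt"
        "strings"
    )

    // "kubernetes.io/local-volume/local-storage07-crc" splits at the last
    // "/" into the plugin name and the inner volume name.
    func splitUniqueName(unique string) (plugin, volume string) {
        i := strings.LastIndex(unique, "/")
        if i < 0 {
            return "", unique
        }
        return unique[:i], unique[i+1:]
    }

    func main() {
        p, v := splitUniqueName("kubernetes.io/local-volume/local-storage07-crc")
        fmt.Println(p, "->", v) // kubernetes.io/local-volume -> local-storage07-crc
    }
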
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.349597 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.349639 4636 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/180ad50d-7586-4906-85d4-463e49a589c7-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.349652 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pblw2\" (UniqueName: \"kubernetes.io/projected/180ad50d-7586-4906-85d4-463e49a589c7-kube-api-access-pblw2\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.349666 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.349696 4636 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.349708 4636 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/180ad50d-7586-4906-85d4-463e49a589c7-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.358353 4636 generic.go:334] "Generic (PLEG): container finished" podID="180ad50d-7586-4906-85d4-463e49a589c7" containerID="29105bd946cb053dc362afe570c3a986e3b13c26547f03cb219be90d018a3ada" exitCode=143 Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.358384 4636 generic.go:334] "Generic (PLEG): container finished" podID="180ad50d-7586-4906-85d4-463e49a589c7" containerID="bd325a24bd3f8724d777f89d5607c93f3464012688e0614f770d512cfdeb7c18" exitCode=143 Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.358439 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"180ad50d-7586-4906-85d4-463e49a589c7","Type":"ContainerDied","Data":"29105bd946cb053dc362afe570c3a986e3b13c26547f03cb219be90d018a3ada"} Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.358483 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"180ad50d-7586-4906-85d4-463e49a589c7","Type":"ContainerDied","Data":"bd325a24bd3f8724d777f89d5607c93f3464012688e0614f770d512cfdeb7c18"} Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.358493 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"180ad50d-7586-4906-85d4-463e49a589c7","Type":"ContainerDied","Data":"64906872ac3aa715da638649660f9971c8c7daaff2a3ed5591fff5c90d894d40"} Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.358508 4636 scope.go:117] "RemoveContainer" containerID="29105bd946cb053dc362afe570c3a986e3b13c26547f03cb219be90d018a3ada" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.358644 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.378417 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="01e51d41-560a-474b-9b2e-09d2c267878c" containerName="glance-log" containerID="cri-o://1ec99c0577d80ea99a219b6f5ef3668258f693c6e76c3d4228b3f06133b372ba" gracePeriod=30 Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.378518 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"01e51d41-560a-474b-9b2e-09d2c267878c","Type":"ContainerStarted","Data":"8fe8625f5d4e89b4a7ac3abd92ddb51ac202b92725a54e4f5f0c5d6200af16c0"} Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.379143 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="01e51d41-560a-474b-9b2e-09d2c267878c" containerName="glance-httpd" containerID="cri-o://8fe8625f5d4e89b4a7ac3abd92ddb51ac202b92725a54e4f5f0c5d6200af16c0" gracePeriod=30 Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.432352 4636 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.444705 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.444682519 podStartE2EDuration="5.444682519s" podCreationTimestamp="2025-10-03 14:19:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:19:05.423287658 +0000 UTC m=+1095.282013905" watchObservedRunningTime="2025-10-03 14:19:05.444682519 +0000 UTC m=+1095.303408766" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.451339 4636 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.465366 4636 scope.go:117] "RemoveContainer" containerID="bd325a24bd3f8724d777f89d5607c93f3464012688e0614f770d512cfdeb7c18" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.481326 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-config-data" (OuterVolumeSpecName: "config-data") pod "180ad50d-7586-4906-85d4-463e49a589c7" (UID: "180ad50d-7586-4906-85d4-463e49a589c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.481443 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "180ad50d-7586-4906-85d4-463e49a589c7" (UID: "180ad50d-7586-4906-85d4-463e49a589c7"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.546763 4636 scope.go:117] "RemoveContainer" containerID="29105bd946cb053dc362afe570c3a986e3b13c26547f03cb219be90d018a3ada" Oct 03 14:19:05 crc kubenswrapper[4636]: E1003 14:19:05.553723 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29105bd946cb053dc362afe570c3a986e3b13c26547f03cb219be90d018a3ada\": container with ID starting with 29105bd946cb053dc362afe570c3a986e3b13c26547f03cb219be90d018a3ada not found: ID does not exist" containerID="29105bd946cb053dc362afe570c3a986e3b13c26547f03cb219be90d018a3ada" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.553770 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29105bd946cb053dc362afe570c3a986e3b13c26547f03cb219be90d018a3ada"} err="failed to get container status \"29105bd946cb053dc362afe570c3a986e3b13c26547f03cb219be90d018a3ada\": rpc error: code = NotFound desc = could not find container \"29105bd946cb053dc362afe570c3a986e3b13c26547f03cb219be90d018a3ada\": container with ID starting with 29105bd946cb053dc362afe570c3a986e3b13c26547f03cb219be90d018a3ada not found: ID does not exist" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.553798 4636 scope.go:117] "RemoveContainer" containerID="bd325a24bd3f8724d777f89d5607c93f3464012688e0614f770d512cfdeb7c18" Oct 03 14:19:05 crc kubenswrapper[4636]: E1003 14:19:05.554734 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd325a24bd3f8724d777f89d5607c93f3464012688e0614f770d512cfdeb7c18\": container with ID starting with bd325a24bd3f8724d777f89d5607c93f3464012688e0614f770d512cfdeb7c18 not found: ID does not exist" containerID="bd325a24bd3f8724d777f89d5607c93f3464012688e0614f770d512cfdeb7c18" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.554772 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd325a24bd3f8724d777f89d5607c93f3464012688e0614f770d512cfdeb7c18"} err="failed to get container status \"bd325a24bd3f8724d777f89d5607c93f3464012688e0614f770d512cfdeb7c18\": rpc error: code = NotFound desc = could not find container \"bd325a24bd3f8724d777f89d5607c93f3464012688e0614f770d512cfdeb7c18\": container with ID starting with bd325a24bd3f8724d777f89d5607c93f3464012688e0614f770d512cfdeb7c18 not found: ID does not exist" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.554796 4636 scope.go:117] "RemoveContainer" containerID="29105bd946cb053dc362afe570c3a986e3b13c26547f03cb219be90d018a3ada" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.555019 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.555063 4636 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/180ad50d-7586-4906-85d4-463e49a589c7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.555192 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29105bd946cb053dc362afe570c3a986e3b13c26547f03cb219be90d018a3ada"} err="failed to get container status 
\"29105bd946cb053dc362afe570c3a986e3b13c26547f03cb219be90d018a3ada\": rpc error: code = NotFound desc = could not find container \"29105bd946cb053dc362afe570c3a986e3b13c26547f03cb219be90d018a3ada\": container with ID starting with 29105bd946cb053dc362afe570c3a986e3b13c26547f03cb219be90d018a3ada not found: ID does not exist" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.555210 4636 scope.go:117] "RemoveContainer" containerID="bd325a24bd3f8724d777f89d5607c93f3464012688e0614f770d512cfdeb7c18" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.555529 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd325a24bd3f8724d777f89d5607c93f3464012688e0614f770d512cfdeb7c18"} err="failed to get container status \"bd325a24bd3f8724d777f89d5607c93f3464012688e0614f770d512cfdeb7c18\": rpc error: code = NotFound desc = could not find container \"bd325a24bd3f8724d777f89d5607c93f3464012688e0614f770d512cfdeb7c18\": container with ID starting with bd325a24bd3f8724d777f89d5607c93f3464012688e0614f770d512cfdeb7c18 not found: ID does not exist" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.715183 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.725227 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.743516 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:19:05 crc kubenswrapper[4636]: E1003 14:19:05.743891 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="180ad50d-7586-4906-85d4-463e49a589c7" containerName="glance-log" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.743907 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="180ad50d-7586-4906-85d4-463e49a589c7" containerName="glance-log" Oct 03 14:19:05 crc kubenswrapper[4636]: E1003 14:19:05.743923 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="180ad50d-7586-4906-85d4-463e49a589c7" containerName="glance-httpd" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.743931 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="180ad50d-7586-4906-85d4-463e49a589c7" containerName="glance-httpd" Oct 03 14:19:05 crc kubenswrapper[4636]: E1003 14:19:05.743943 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09e8b285-eaae-4481-ac48-552c47aef7ab" containerName="init" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.743949 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="09e8b285-eaae-4481-ac48-552c47aef7ab" containerName="init" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.744112 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="09e8b285-eaae-4481-ac48-552c47aef7ab" containerName="init" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.744139 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="180ad50d-7586-4906-85d4-463e49a589c7" containerName="glance-log" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.744152 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="180ad50d-7586-4906-85d4-463e49a589c7" containerName="glance-httpd" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.745046 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.750011 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.750228 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.775020 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.871918 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-scripts\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.871993 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16535f8a-4be6-4805-b69a-667593095ad0-logs\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.872071 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8dtq\" (UniqueName: \"kubernetes.io/projected/16535f8a-4be6-4805-b69a-667593095ad0-kube-api-access-x8dtq\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.872111 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.872141 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.872182 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-config-data\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.872214 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16535f8a-4be6-4805-b69a-667593095ad0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.872233 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.973678 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-scripts\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.973993 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16535f8a-4be6-4805-b69a-667593095ad0-logs\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.974050 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8dtq\" (UniqueName: \"kubernetes.io/projected/16535f8a-4be6-4805-b69a-667593095ad0-kube-api-access-x8dtq\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.974070 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.974122 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.974163 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-config-data\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.974211 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16535f8a-4be6-4805-b69a-667593095ad0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.974237 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.977335 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/16535f8a-4be6-4805-b69a-667593095ad0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.977558 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16535f8a-4be6-4805-b69a-667593095ad0-logs\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.977680 4636 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.980251 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-scripts\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.981329 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.985471 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-config-data\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:05 crc kubenswrapper[4636]: I1003 14:19:05.986004 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:06 crc kubenswrapper[4636]: I1003 14:19:06.006860 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8dtq\" (UniqueName: \"kubernetes.io/projected/16535f8a-4be6-4805-b69a-667593095ad0-kube-api-access-x8dtq\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:06 crc kubenswrapper[4636]: I1003 14:19:06.022695 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:06 crc kubenswrapper[4636]: I1003 14:19:06.111209 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 14:19:06 crc kubenswrapper[4636]: I1003 14:19:06.408932 4636 generic.go:334] "Generic (PLEG): container finished" podID="01e51d41-560a-474b-9b2e-09d2c267878c" containerID="8fe8625f5d4e89b4a7ac3abd92ddb51ac202b92725a54e4f5f0c5d6200af16c0" exitCode=143 Oct 03 14:19:06 crc kubenswrapper[4636]: I1003 14:19:06.409248 4636 generic.go:334] "Generic (PLEG): container finished" podID="01e51d41-560a-474b-9b2e-09d2c267878c" containerID="1ec99c0577d80ea99a219b6f5ef3668258f693c6e76c3d4228b3f06133b372ba" exitCode=143 Oct 03 14:19:06 crc kubenswrapper[4636]: I1003 14:19:06.409020 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"01e51d41-560a-474b-9b2e-09d2c267878c","Type":"ContainerDied","Data":"8fe8625f5d4e89b4a7ac3abd92ddb51ac202b92725a54e4f5f0c5d6200af16c0"} Oct 03 14:19:06 crc kubenswrapper[4636]: I1003 14:19:06.409303 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"01e51d41-560a-474b-9b2e-09d2c267878c","Type":"ContainerDied","Data":"1ec99c0577d80ea99a219b6f5ef3668258f693c6e76c3d4228b3f06133b372ba"} Oct 03 14:19:06 crc kubenswrapper[4636]: I1003 14:19:06.850621 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="180ad50d-7586-4906-85d4-463e49a589c7" path="/var/lib/kubelet/pods/180ad50d-7586-4906-85d4-463e49a589c7/volumes" Oct 03 14:19:06 crc kubenswrapper[4636]: I1003 14:19:06.884417 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 14:19:06 crc kubenswrapper[4636]: I1003 14:19:06.947912 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:19:06 crc kubenswrapper[4636]: I1003 14:19:06.996000 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc5vc\" (UniqueName: \"kubernetes.io/projected/01e51d41-560a-474b-9b2e-09d2c267878c-kube-api-access-cc5vc\") pod \"01e51d41-560a-474b-9b2e-09d2c267878c\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " Oct 03 14:19:06 crc kubenswrapper[4636]: I1003 14:19:06.996092 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01e51d41-560a-474b-9b2e-09d2c267878c-logs\") pod \"01e51d41-560a-474b-9b2e-09d2c267878c\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " Oct 03 14:19:06 crc kubenswrapper[4636]: I1003 14:19:06.996175 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-internal-tls-certs\") pod \"01e51d41-560a-474b-9b2e-09d2c267878c\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " Oct 03 14:19:06 crc kubenswrapper[4636]: I1003 14:19:06.996236 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"01e51d41-560a-474b-9b2e-09d2c267878c\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " Oct 03 14:19:06 crc kubenswrapper[4636]: I1003 14:19:06.996340 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-config-data\") pod \"01e51d41-560a-474b-9b2e-09d2c267878c\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " Oct 03 
14:19:06 crc kubenswrapper[4636]: I1003 14:19:06.996441 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01e51d41-560a-474b-9b2e-09d2c267878c-httpd-run\") pod \"01e51d41-560a-474b-9b2e-09d2c267878c\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " Oct 03 14:19:06 crc kubenswrapper[4636]: I1003 14:19:06.996464 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-combined-ca-bundle\") pod \"01e51d41-560a-474b-9b2e-09d2c267878c\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " Oct 03 14:19:06 crc kubenswrapper[4636]: I1003 14:19:06.996525 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-scripts\") pod \"01e51d41-560a-474b-9b2e-09d2c267878c\" (UID: \"01e51d41-560a-474b-9b2e-09d2c267878c\") " Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.001463 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01e51d41-560a-474b-9b2e-09d2c267878c-logs" (OuterVolumeSpecName: "logs") pod "01e51d41-560a-474b-9b2e-09d2c267878c" (UID: "01e51d41-560a-474b-9b2e-09d2c267878c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.001647 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "01e51d41-560a-474b-9b2e-09d2c267878c" (UID: "01e51d41-560a-474b-9b2e-09d2c267878c"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.002048 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01e51d41-560a-474b-9b2e-09d2c267878c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "01e51d41-560a-474b-9b2e-09d2c267878c" (UID: "01e51d41-560a-474b-9b2e-09d2c267878c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.002801 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-scripts" (OuterVolumeSpecName: "scripts") pod "01e51d41-560a-474b-9b2e-09d2c267878c" (UID: "01e51d41-560a-474b-9b2e-09d2c267878c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.006248 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01e51d41-560a-474b-9b2e-09d2c267878c-kube-api-access-cc5vc" (OuterVolumeSpecName: "kube-api-access-cc5vc") pod "01e51d41-560a-474b-9b2e-09d2c267878c" (UID: "01e51d41-560a-474b-9b2e-09d2c267878c"). InnerVolumeSpecName "kube-api-access-cc5vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.049062 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01e51d41-560a-474b-9b2e-09d2c267878c" (UID: "01e51d41-560a-474b-9b2e-09d2c267878c"). 
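
Each TearDown above empties or unmounts one subdirectory of the pod's volume tree; once every volume is gone, kubelet_volumes.go logs "Cleaned up orphaned pod volumes dir" with the path, as it did for 180ad50d further up. The on-disk layout those paths follow is /var/lib/kubelet/pods/<podUID>/volumes/<plugin>/<name>, with the "/" in the plugin name escaped as "~". A sketch of that path construction (podVolumeDir is a hypothetical helper, not a kubelet API):

    package main

    import (
        "fmt"
        "path/filepath"
        "strings"
    )

    func podVolumeDir(podUID, plugin, volume string) string {
        return filepath.Join("/var/lib/kubelet/pods", podUID,
            "volumes", strings.ReplaceAll(plugin, "/", "~"), volume)
    }

    func main() {
        fmt.Println(podVolumeDir("01e51d41-560a-474b-9b2e-09d2c267878c",
            "kubernetes.io/empty-dir", "logs"))
        // /var/lib/kubelet/pods/01e51d41-.../volumes/kubernetes.io~empty-dir/logs
    }
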
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.077285 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "01e51d41-560a-474b-9b2e-09d2c267878c" (UID: "01e51d41-560a-474b-9b2e-09d2c267878c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.098439 4636 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01e51d41-560a-474b-9b2e-09d2c267878c-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.098468 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.098480 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.098490 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc5vc\" (UniqueName: \"kubernetes.io/projected/01e51d41-560a-474b-9b2e-09d2c267878c-kube-api-access-cc5vc\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.098499 4636 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01e51d41-560a-474b-9b2e-09d2c267878c-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.098508 4636 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.098529 4636 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.123983 4636 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.150130 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-config-data" (OuterVolumeSpecName: "config-data") pod "01e51d41-560a-474b-9b2e-09d2c267878c" (UID: "01e51d41-560a-474b-9b2e-09d2c267878c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.208042 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e51d41-560a-474b-9b2e-09d2c267878c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.208077 4636 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.311028 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64994b5457-95phr"] Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.339546 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7976d47688-kx5v5"] Oct 03 14:19:07 crc kubenswrapper[4636]: E1003 14:19:07.340004 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e51d41-560a-474b-9b2e-09d2c267878c" containerName="glance-log" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.340025 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e51d41-560a-474b-9b2e-09d2c267878c" containerName="glance-log" Oct 03 14:19:07 crc kubenswrapper[4636]: E1003 14:19:07.340048 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e51d41-560a-474b-9b2e-09d2c267878c" containerName="glance-httpd" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.340057 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e51d41-560a-474b-9b2e-09d2c267878c" containerName="glance-httpd" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.340294 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e51d41-560a-474b-9b2e-09d2c267878c" containerName="glance-httpd" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.340310 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e51d41-560a-474b-9b2e-09d2c267878c" containerName="glance-log" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.341522 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.353560 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.377674 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7976d47688-kx5v5"] Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.417712 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/92ef2fa8-5e4e-49f1-8840-01b5be29d036-horizon-secret-key\") pod \"horizon-7976d47688-kx5v5\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.417755 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/92ef2fa8-5e4e-49f1-8840-01b5be29d036-horizon-tls-certs\") pod \"horizon-7976d47688-kx5v5\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.417788 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92ef2fa8-5e4e-49f1-8840-01b5be29d036-logs\") pod \"horizon-7976d47688-kx5v5\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.422479 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92ef2fa8-5e4e-49f1-8840-01b5be29d036-combined-ca-bundle\") pod \"horizon-7976d47688-kx5v5\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.431695 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.435656 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vccvt\" (UniqueName: \"kubernetes.io/projected/92ef2fa8-5e4e-49f1-8840-01b5be29d036-kube-api-access-vccvt\") pod \"horizon-7976d47688-kx5v5\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.436252 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/92ef2fa8-5e4e-49f1-8840-01b5be29d036-config-data\") pod \"horizon-7976d47688-kx5v5\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.437407 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92ef2fa8-5e4e-49f1-8840-01b5be29d036-scripts\") pod \"horizon-7976d47688-kx5v5\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.495271 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c8488ff4f-4mhlb"] Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 
14:19:07.516327 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16535f8a-4be6-4805-b69a-667593095ad0","Type":"ContainerStarted","Data":"59ce9823ac6f70dbf3bf8251786ebba92cad8730e822b8c9acdc1621e7d581ce"} Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.534556 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8c5bc9456-rfvns"] Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.536883 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.541939 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92ef2fa8-5e4e-49f1-8840-01b5be29d036-combined-ca-bundle\") pod \"horizon-7976d47688-kx5v5\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.541987 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vccvt\" (UniqueName: \"kubernetes.io/projected/92ef2fa8-5e4e-49f1-8840-01b5be29d036-kube-api-access-vccvt\") pod \"horizon-7976d47688-kx5v5\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.542074 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/92ef2fa8-5e4e-49f1-8840-01b5be29d036-config-data\") pod \"horizon-7976d47688-kx5v5\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.542141 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92ef2fa8-5e4e-49f1-8840-01b5be29d036-scripts\") pod \"horizon-7976d47688-kx5v5\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.542167 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/92ef2fa8-5e4e-49f1-8840-01b5be29d036-horizon-secret-key\") pod \"horizon-7976d47688-kx5v5\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.542189 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/92ef2fa8-5e4e-49f1-8840-01b5be29d036-horizon-tls-certs\") pod \"horizon-7976d47688-kx5v5\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.542213 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92ef2fa8-5e4e-49f1-8840-01b5be29d036-logs\") pod \"horizon-7976d47688-kx5v5\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.543318 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92ef2fa8-5e4e-49f1-8840-01b5be29d036-scripts\") pod \"horizon-7976d47688-kx5v5\" (UID: 
\"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.549858 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92ef2fa8-5e4e-49f1-8840-01b5be29d036-logs\") pod \"horizon-7976d47688-kx5v5\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.555201 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8c5bc9456-rfvns"] Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.555275 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92ef2fa8-5e4e-49f1-8840-01b5be29d036-combined-ca-bundle\") pod \"horizon-7976d47688-kx5v5\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.555282 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/92ef2fa8-5e4e-49f1-8840-01b5be29d036-horizon-secret-key\") pod \"horizon-7976d47688-kx5v5\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.557448 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/92ef2fa8-5e4e-49f1-8840-01b5be29d036-config-data\") pod \"horizon-7976d47688-kx5v5\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.560748 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/92ef2fa8-5e4e-49f1-8840-01b5be29d036-horizon-tls-certs\") pod \"horizon-7976d47688-kx5v5\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.588773 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"01e51d41-560a-474b-9b2e-09d2c267878c","Type":"ContainerDied","Data":"a27bb89cd0df39289e8d5164bca85610b34998b1e8ac07bcdd65616af630c2d5"} Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.588827 4636 scope.go:117] "RemoveContainer" containerID="8fe8625f5d4e89b4a7ac3abd92ddb51ac202b92725a54e4f5f0c5d6200af16c0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.590359 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.602202 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vccvt\" (UniqueName: \"kubernetes.io/projected/92ef2fa8-5e4e-49f1-8840-01b5be29d036-kube-api-access-vccvt\") pod \"horizon-7976d47688-kx5v5\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.643602 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0025da7c-17f3-4036-a9fc-3330508c11cd-config-data\") pod \"horizon-8c5bc9456-rfvns\" (UID: \"0025da7c-17f3-4036-a9fc-3330508c11cd\") " pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.643677 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0025da7c-17f3-4036-a9fc-3330508c11cd-horizon-secret-key\") pod \"horizon-8c5bc9456-rfvns\" (UID: \"0025da7c-17f3-4036-a9fc-3330508c11cd\") " pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.643732 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0025da7c-17f3-4036-a9fc-3330508c11cd-horizon-tls-certs\") pod \"horizon-8c5bc9456-rfvns\" (UID: \"0025da7c-17f3-4036-a9fc-3330508c11cd\") " pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.643771 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0025da7c-17f3-4036-a9fc-3330508c11cd-logs\") pod \"horizon-8c5bc9456-rfvns\" (UID: \"0025da7c-17f3-4036-a9fc-3330508c11cd\") " pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.643900 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0025da7c-17f3-4036-a9fc-3330508c11cd-combined-ca-bundle\") pod \"horizon-8c5bc9456-rfvns\" (UID: \"0025da7c-17f3-4036-a9fc-3330508c11cd\") " pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.643994 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0025da7c-17f3-4036-a9fc-3330508c11cd-scripts\") pod \"horizon-8c5bc9456-rfvns\" (UID: \"0025da7c-17f3-4036-a9fc-3330508c11cd\") " pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.644063 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrpzd\" (UniqueName: \"kubernetes.io/projected/0025da7c-17f3-4036-a9fc-3330508c11cd-kube-api-access-zrpzd\") pod \"horizon-8c5bc9456-rfvns\" (UID: \"0025da7c-17f3-4036-a9fc-3330508c11cd\") " pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.689764 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.691238 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.705190 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.716356 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.717990 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.727049 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.727708 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.745729 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0025da7c-17f3-4036-a9fc-3330508c11cd-scripts\") pod \"horizon-8c5bc9456-rfvns\" (UID: \"0025da7c-17f3-4036-a9fc-3330508c11cd\") " pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.745784 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrpzd\" (UniqueName: \"kubernetes.io/projected/0025da7c-17f3-4036-a9fc-3330508c11cd-kube-api-access-zrpzd\") pod \"horizon-8c5bc9456-rfvns\" (UID: \"0025da7c-17f3-4036-a9fc-3330508c11cd\") " pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.745842 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0025da7c-17f3-4036-a9fc-3330508c11cd-config-data\") pod \"horizon-8c5bc9456-rfvns\" (UID: \"0025da7c-17f3-4036-a9fc-3330508c11cd\") " pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.745904 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0025da7c-17f3-4036-a9fc-3330508c11cd-horizon-secret-key\") pod \"horizon-8c5bc9456-rfvns\" (UID: \"0025da7c-17f3-4036-a9fc-3330508c11cd\") " pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.745971 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0025da7c-17f3-4036-a9fc-3330508c11cd-horizon-tls-certs\") pod \"horizon-8c5bc9456-rfvns\" (UID: \"0025da7c-17f3-4036-a9fc-3330508c11cd\") " pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.745996 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0025da7c-17f3-4036-a9fc-3330508c11cd-logs\") pod \"horizon-8c5bc9456-rfvns\" (UID: \"0025da7c-17f3-4036-a9fc-3330508c11cd\") " pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.746020 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0025da7c-17f3-4036-a9fc-3330508c11cd-combined-ca-bundle\") pod \"horizon-8c5bc9456-rfvns\" (UID: \"0025da7c-17f3-4036-a9fc-3330508c11cd\") " 
pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.747519 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0025da7c-17f3-4036-a9fc-3330508c11cd-logs\") pod \"horizon-8c5bc9456-rfvns\" (UID: \"0025da7c-17f3-4036-a9fc-3330508c11cd\") " pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.753347 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0025da7c-17f3-4036-a9fc-3330508c11cd-scripts\") pod \"horizon-8c5bc9456-rfvns\" (UID: \"0025da7c-17f3-4036-a9fc-3330508c11cd\") " pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.757034 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0025da7c-17f3-4036-a9fc-3330508c11cd-horizon-tls-certs\") pod \"horizon-8c5bc9456-rfvns\" (UID: \"0025da7c-17f3-4036-a9fc-3330508c11cd\") " pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.758517 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0025da7c-17f3-4036-a9fc-3330508c11cd-config-data\") pod \"horizon-8c5bc9456-rfvns\" (UID: \"0025da7c-17f3-4036-a9fc-3330508c11cd\") " pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.760038 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0025da7c-17f3-4036-a9fc-3330508c11cd-combined-ca-bundle\") pod \"horizon-8c5bc9456-rfvns\" (UID: \"0025da7c-17f3-4036-a9fc-3330508c11cd\") " pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.760064 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0025da7c-17f3-4036-a9fc-3330508c11cd-horizon-secret-key\") pod \"horizon-8c5bc9456-rfvns\" (UID: \"0025da7c-17f3-4036-a9fc-3330508c11cd\") " pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.763749 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.789058 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrpzd\" (UniqueName: \"kubernetes.io/projected/0025da7c-17f3-4036-a9fc-3330508c11cd-kube-api-access-zrpzd\") pod \"horizon-8c5bc9456-rfvns\" (UID: \"0025da7c-17f3-4036-a9fc-3330508c11cd\") " pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.849051 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fbf639b-5c21-47d7-9596-091f6b796167-logs\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.849241 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " 
pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.849283 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llsnv\" (UniqueName: \"kubernetes.io/projected/8fbf639b-5c21-47d7-9596-091f6b796167-kube-api-access-llsnv\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.849343 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8fbf639b-5c21-47d7-9596-091f6b796167-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.849372 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.849390 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.849428 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.849445 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.950865 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.950924 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llsnv\" (UniqueName: \"kubernetes.io/projected/8fbf639b-5c21-47d7-9596-091f6b796167-kube-api-access-llsnv\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.950977 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8fbf639b-5c21-47d7-9596-091f6b796167-httpd-run\") pod \"glance-default-internal-api-0\" 
(UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.951008 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.951024 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.951065 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.951082 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.951154 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fbf639b-5c21-47d7-9596-091f6b796167-logs\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.951569 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fbf639b-5c21-47d7-9596-091f6b796167-logs\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.954579 4636 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.957292 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.958040 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: 
I1003 14:19:07.960718 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.963518 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8fbf639b-5c21-47d7-9596-091f6b796167-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.964539 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.972841 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:07 crc kubenswrapper[4636]: I1003 14:19:07.979404 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl" Oct 03 14:19:08 crc kubenswrapper[4636]: I1003 14:19:07.998883 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llsnv\" (UniqueName: \"kubernetes.io/projected/8fbf639b-5c21-47d7-9596-091f6b796167-kube-api-access-llsnv\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:08 crc kubenswrapper[4636]: I1003 14:19:08.008777 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:19:08 crc kubenswrapper[4636]: I1003 14:19:08.116518 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-68zzd"] Oct 03 14:19:08 crc kubenswrapper[4636]: I1003 14:19:08.121913 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 14:19:08 crc kubenswrapper[4636]: I1003 14:19:08.128703 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-68zzd" podUID="3e923689-aa01-44f4-941e-56418b1c3fe5" containerName="dnsmasq-dns" containerID="cri-o://c28b53c7e699590d302089050ae3122674a36c7fbdb9e56e50be35b6965e2212" gracePeriod=10 Oct 03 14:19:08 crc kubenswrapper[4636]: I1003 14:19:08.626317 4636 generic.go:334] "Generic (PLEG): container finished" podID="3e923689-aa01-44f4-941e-56418b1c3fe5" containerID="c28b53c7e699590d302089050ae3122674a36c7fbdb9e56e50be35b6965e2212" exitCode=0 Oct 03 14:19:08 crc kubenswrapper[4636]: I1003 14:19:08.626675 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-68zzd" event={"ID":"3e923689-aa01-44f4-941e-56418b1c3fe5","Type":"ContainerDied","Data":"c28b53c7e699590d302089050ae3122674a36c7fbdb9e56e50be35b6965e2212"} Oct 03 14:19:08 crc kubenswrapper[4636]: I1003 14:19:08.821959 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01e51d41-560a-474b-9b2e-09d2c267878c" path="/var/lib/kubelet/pods/01e51d41-560a-474b-9b2e-09d2c267878c/volumes" Oct 03 14:19:09 crc kubenswrapper[4636]: I1003 14:19:09.637203 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16535f8a-4be6-4805-b69a-667593095ad0","Type":"ContainerStarted","Data":"3ae7a32d96d80d6f23a7a881b376db604449bac5f6cc8b484efb16c7ad1e544a"} Oct 03 14:19:09 crc kubenswrapper[4636]: I1003 14:19:09.643373 4636 generic.go:334] "Generic (PLEG): container finished" podID="c7676518-1b64-43b5-85ab-44217dbdfa68" containerID="7cc5445780c2bc60b13e6c4386f884c0cc76b440675901fee80ec24a5ab19745" exitCode=0 Oct 03 14:19:09 crc kubenswrapper[4636]: I1003 14:19:09.643428 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-swxgw" event={"ID":"c7676518-1b64-43b5-85ab-44217dbdfa68","Type":"ContainerDied","Data":"7cc5445780c2bc60b13e6c4386f884c0cc76b440675901fee80ec24a5ab19745"} Oct 03 14:19:11 crc kubenswrapper[4636]: I1003 14:19:11.806471 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-68zzd" podUID="3e923689-aa01-44f4-941e-56418b1c3fe5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Oct 03 14:19:12 crc kubenswrapper[4636]: I1003 14:19:12.937629 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-swxgw" Oct 03 14:19:13 crc kubenswrapper[4636]: I1003 14:19:13.093672 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-config-data\") pod \"c7676518-1b64-43b5-85ab-44217dbdfa68\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " Oct 03 14:19:13 crc kubenswrapper[4636]: I1003 14:19:13.093718 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-combined-ca-bundle\") pod \"c7676518-1b64-43b5-85ab-44217dbdfa68\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " Oct 03 14:19:13 crc kubenswrapper[4636]: I1003 14:19:13.093771 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwsms\" (UniqueName: \"kubernetes.io/projected/c7676518-1b64-43b5-85ab-44217dbdfa68-kube-api-access-xwsms\") pod \"c7676518-1b64-43b5-85ab-44217dbdfa68\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " Oct 03 14:19:13 crc kubenswrapper[4636]: I1003 14:19:13.093877 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-fernet-keys\") pod \"c7676518-1b64-43b5-85ab-44217dbdfa68\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " Oct 03 14:19:13 crc kubenswrapper[4636]: I1003 14:19:13.093914 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-credential-keys\") pod \"c7676518-1b64-43b5-85ab-44217dbdfa68\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " Oct 03 14:19:13 crc kubenswrapper[4636]: I1003 14:19:13.093972 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-scripts\") pod \"c7676518-1b64-43b5-85ab-44217dbdfa68\" (UID: \"c7676518-1b64-43b5-85ab-44217dbdfa68\") " Oct 03 14:19:13 crc kubenswrapper[4636]: I1003 14:19:13.098385 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c7676518-1b64-43b5-85ab-44217dbdfa68" (UID: "c7676518-1b64-43b5-85ab-44217dbdfa68"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:13 crc kubenswrapper[4636]: I1003 14:19:13.099193 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-scripts" (OuterVolumeSpecName: "scripts") pod "c7676518-1b64-43b5-85ab-44217dbdfa68" (UID: "c7676518-1b64-43b5-85ab-44217dbdfa68"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:13 crc kubenswrapper[4636]: I1003 14:19:13.107673 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7676518-1b64-43b5-85ab-44217dbdfa68-kube-api-access-xwsms" (OuterVolumeSpecName: "kube-api-access-xwsms") pod "c7676518-1b64-43b5-85ab-44217dbdfa68" (UID: "c7676518-1b64-43b5-85ab-44217dbdfa68"). InnerVolumeSpecName "kube-api-access-xwsms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:19:13 crc kubenswrapper[4636]: I1003 14:19:13.113429 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c7676518-1b64-43b5-85ab-44217dbdfa68" (UID: "c7676518-1b64-43b5-85ab-44217dbdfa68"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:13 crc kubenswrapper[4636]: I1003 14:19:13.137003 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7676518-1b64-43b5-85ab-44217dbdfa68" (UID: "c7676518-1b64-43b5-85ab-44217dbdfa68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:13 crc kubenswrapper[4636]: I1003 14:19:13.144558 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-config-data" (OuterVolumeSpecName: "config-data") pod "c7676518-1b64-43b5-85ab-44217dbdfa68" (UID: "c7676518-1b64-43b5-85ab-44217dbdfa68"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:13 crc kubenswrapper[4636]: I1003 14:19:13.196835 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:13 crc kubenswrapper[4636]: I1003 14:19:13.196869 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:13 crc kubenswrapper[4636]: I1003 14:19:13.196880 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwsms\" (UniqueName: \"kubernetes.io/projected/c7676518-1b64-43b5-85ab-44217dbdfa68-kube-api-access-xwsms\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:13 crc kubenswrapper[4636]: I1003 14:19:13.196891 4636 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:13 crc kubenswrapper[4636]: I1003 14:19:13.196899 4636 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:13 crc kubenswrapper[4636]: I1003 14:19:13.196907 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7676518-1b64-43b5-85ab-44217dbdfa68-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:13 crc kubenswrapper[4636]: I1003 14:19:13.696914 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-swxgw" event={"ID":"c7676518-1b64-43b5-85ab-44217dbdfa68","Type":"ContainerDied","Data":"d0219683a17a60be8865249bcac2cf04d5e32064127434dde818d67be23ef6cc"} Oct 03 14:19:13 crc kubenswrapper[4636]: I1003 14:19:13.697221 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0219683a17a60be8865249bcac2cf04d5e32064127434dde818d67be23ef6cc" Oct 03 14:19:13 crc kubenswrapper[4636]: I1003 14:19:13.696954 4636 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-swxgw" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.043619 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-swxgw"] Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.054721 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-swxgw"] Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.127585 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-254t7"] Oct 03 14:19:14 crc kubenswrapper[4636]: E1003 14:19:14.128153 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7676518-1b64-43b5-85ab-44217dbdfa68" containerName="keystone-bootstrap" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.128178 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7676518-1b64-43b5-85ab-44217dbdfa68" containerName="keystone-bootstrap" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.128382 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7676518-1b64-43b5-85ab-44217dbdfa68" containerName="keystone-bootstrap" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.129113 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-254t7" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.134330 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.134446 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.135742 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-l2krf" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.135882 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.140054 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-254t7"] Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.220071 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-combined-ca-bundle\") pod \"keystone-bootstrap-254t7\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " pod="openstack/keystone-bootstrap-254t7" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.220161 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-config-data\") pod \"keystone-bootstrap-254t7\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " pod="openstack/keystone-bootstrap-254t7" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.220185 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-credential-keys\") pod \"keystone-bootstrap-254t7\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " pod="openstack/keystone-bootstrap-254t7" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.220207 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-scripts\") pod \"keystone-bootstrap-254t7\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " pod="openstack/keystone-bootstrap-254t7" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.220257 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cngb\" (UniqueName: \"kubernetes.io/projected/9e686877-1f2c-4049-8f72-2788c4ff74b8-kube-api-access-9cngb\") pod \"keystone-bootstrap-254t7\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " pod="openstack/keystone-bootstrap-254t7" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.220364 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-fernet-keys\") pod \"keystone-bootstrap-254t7\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " pod="openstack/keystone-bootstrap-254t7" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.321778 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-config-data\") pod \"keystone-bootstrap-254t7\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " pod="openstack/keystone-bootstrap-254t7" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.321830 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-credential-keys\") pod \"keystone-bootstrap-254t7\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " pod="openstack/keystone-bootstrap-254t7" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.321856 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-scripts\") pod \"keystone-bootstrap-254t7\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " pod="openstack/keystone-bootstrap-254t7" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.321938 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cngb\" (UniqueName: \"kubernetes.io/projected/9e686877-1f2c-4049-8f72-2788c4ff74b8-kube-api-access-9cngb\") pod \"keystone-bootstrap-254t7\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " pod="openstack/keystone-bootstrap-254t7" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.322005 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-fernet-keys\") pod \"keystone-bootstrap-254t7\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " pod="openstack/keystone-bootstrap-254t7" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.322050 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-combined-ca-bundle\") pod \"keystone-bootstrap-254t7\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " pod="openstack/keystone-bootstrap-254t7" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.326685 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-combined-ca-bundle\") pod 
\"keystone-bootstrap-254t7\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " pod="openstack/keystone-bootstrap-254t7" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.326980 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-config-data\") pod \"keystone-bootstrap-254t7\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " pod="openstack/keystone-bootstrap-254t7" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.337293 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-scripts\") pod \"keystone-bootstrap-254t7\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " pod="openstack/keystone-bootstrap-254t7" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.338091 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-fernet-keys\") pod \"keystone-bootstrap-254t7\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " pod="openstack/keystone-bootstrap-254t7" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.350452 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cngb\" (UniqueName: \"kubernetes.io/projected/9e686877-1f2c-4049-8f72-2788c4ff74b8-kube-api-access-9cngb\") pod \"keystone-bootstrap-254t7\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " pod="openstack/keystone-bootstrap-254t7" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.355774 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-credential-keys\") pod \"keystone-bootstrap-254t7\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " pod="openstack/keystone-bootstrap-254t7" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.463654 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-254t7" Oct 03 14:19:14 crc kubenswrapper[4636]: I1003 14:19:14.807032 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7676518-1b64-43b5-85ab-44217dbdfa68" path="/var/lib/kubelet/pods/c7676518-1b64-43b5-85ab-44217dbdfa68/volumes" Oct 03 14:19:16 crc kubenswrapper[4636]: I1003 14:19:16.805915 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-68zzd" podUID="3e923689-aa01-44f4-941e-56418b1c3fe5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Oct 03 14:19:26 crc kubenswrapper[4636]: I1003 14:19:26.806432 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-68zzd" podUID="3e923689-aa01-44f4-941e-56418b1c3fe5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Oct 03 14:19:26 crc kubenswrapper[4636]: I1003 14:19:26.809361 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-68zzd" Oct 03 14:19:28 crc kubenswrapper[4636]: E1003 14:19:28.041500 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 03 14:19:28 crc kubenswrapper[4636]: E1003 14:19:28.041698 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68dh579h6bh5f8h657hc6h58dh696hcdh5f8hd6h68h58bh566h57ch57bhd5h5cdh5fdh565h688h7bh95h665h68bh66h5c7hb6h5bfhb4h55fh577q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t6phx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-64994b5457-95phr_openstack(a38ad103-c6f8-4570-80f0-c9e7d3577588): ErrImagePull: rpc error: code = Canceled desc 
= copying config: context canceled" logger="UnhandledError" Oct 03 14:19:28 crc kubenswrapper[4636]: E1003 14:19:28.058403 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-64994b5457-95phr" podUID="a38ad103-c6f8-4570-80f0-c9e7d3577588" Oct 03 14:19:28 crc kubenswrapper[4636]: E1003 14:19:28.085863 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 03 14:19:28 crc kubenswrapper[4636]: E1003 14:19:28.086073 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5b6h68bh566h579h5c8h5dbh587hffh699h86h58ch557h566h654h55ch679h6dhc7h7dhb5h5c7hf8hffh64h66ch68h5h657h5d7h57dh578h566q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qxgvs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6cc7cf4789-zfvq9_openstack(6b239dc4-c3c4-4d31-b23f-83a48fc0d72f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:19:28 crc kubenswrapper[4636]: E1003 14:19:28.088592 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6cc7cf4789-zfvq9" 
podUID="6b239dc4-c3c4-4d31-b23f-83a48fc0d72f" Oct 03 14:19:30 crc kubenswrapper[4636]: E1003 14:19:30.192700 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 03 14:19:30 crc kubenswrapper[4636]: E1003 14:19:30.193445 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n545hffh668h544h7ch57h56bhf8h77h544h5dfh74h67bh5b7h595h66h69h54dh9h598h5b5hdfh68dh696h567h65dh55ch559h665h698h65h5c9q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mwd7z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5c8488ff4f-4mhlb_openstack(4f783ef7-df5d-40c8-968d-f965ec7c78e6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:19:30 crc kubenswrapper[4636]: E1003 14:19:30.195795 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5c8488ff4f-4mhlb" podUID="4f783ef7-df5d-40c8-968d-f965ec7c78e6" Oct 03 14:19:30 crc kubenswrapper[4636]: E1003 14:19:30.697507 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Oct 03 14:19:30 crc kubenswrapper[4636]: E1003 14:19:30.697981 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n664h5ffh54h54dh6dh57dh5dbh55bh64fhd8hc9h5f6hc7h686h659h5d5hb4hcfh96h67h575h55h688hcdh6hc6h64ch5b7h5cdh5bh564h54cq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l5pgm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(6ba9d4a1-300f-4367-ba2e-528ed4635dfd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:19:31 crc kubenswrapper[4636]: I1003 14:19:31.003902 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8c5bc9456-rfvns"] Oct 03 14:19:31 crc kubenswrapper[4636]: I1003 14:19:31.808681 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-68zzd" podUID="3e923689-aa01-44f4-941e-56418b1c3fe5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Oct 03 14:19:36 crc kubenswrapper[4636]: I1003 14:19:36.811187 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-68zzd" podUID="3e923689-aa01-44f4-941e-56418b1c3fe5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.162609 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.162929 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.433841 4636 scope.go:117] "RemoveContainer" containerID="1ec99c0577d80ea99a219b6f5ef3668258f693c6e76c3d4228b3f06133b372ba" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.483009 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-68zzd" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.488909 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6cc7cf4789-zfvq9" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.497716 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-ovsdbserver-nb\") pod \"3e923689-aa01-44f4-941e-56418b1c3fe5\" (UID: \"3e923689-aa01-44f4-941e-56418b1c3fe5\") " Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.497767 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-dns-svc\") pod \"3e923689-aa01-44f4-941e-56418b1c3fe5\" (UID: \"3e923689-aa01-44f4-941e-56418b1c3fe5\") " Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.497818 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-logs\") pod \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\" (UID: \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\") " Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.497841 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-config\") pod \"3e923689-aa01-44f4-941e-56418b1c3fe5\" (UID: \"3e923689-aa01-44f4-941e-56418b1c3fe5\") " Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.497879 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-horizon-secret-key\") pod \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\" (UID: \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\") " Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.497911 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxgvs\" (UniqueName: \"kubernetes.io/projected/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-kube-api-access-qxgvs\") pod \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\" (UID: \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\") " Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.497940 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktps9\" (UniqueName: \"kubernetes.io/projected/3e923689-aa01-44f4-941e-56418b1c3fe5-kube-api-access-ktps9\") pod \"3e923689-aa01-44f4-941e-56418b1c3fe5\" (UID: \"3e923689-aa01-44f4-941e-56418b1c3fe5\") " 
Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.498005 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-scripts\") pod \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\" (UID: \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\") " Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.498034 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-config-data\") pod \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\" (UID: \"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f\") " Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.498076 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-ovsdbserver-sb\") pod \"3e923689-aa01-44f4-941e-56418b1c3fe5\" (UID: \"3e923689-aa01-44f4-941e-56418b1c3fe5\") " Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.499818 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-scripts" (OuterVolumeSpecName: "scripts") pod "6b239dc4-c3c4-4d31-b23f-83a48fc0d72f" (UID: "6b239dc4-c3c4-4d31-b23f-83a48fc0d72f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.500030 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-config-data" (OuterVolumeSpecName: "config-data") pod "6b239dc4-c3c4-4d31-b23f-83a48fc0d72f" (UID: "6b239dc4-c3c4-4d31-b23f-83a48fc0d72f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.500208 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64994b5457-95phr" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.501263 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-logs" (OuterVolumeSpecName: "logs") pod "6b239dc4-c3c4-4d31-b23f-83a48fc0d72f" (UID: "6b239dc4-c3c4-4d31-b23f-83a48fc0d72f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.517061 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-kube-api-access-qxgvs" (OuterVolumeSpecName: "kube-api-access-qxgvs") pod "6b239dc4-c3c4-4d31-b23f-83a48fc0d72f" (UID: "6b239dc4-c3c4-4d31-b23f-83a48fc0d72f"). InnerVolumeSpecName "kube-api-access-qxgvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.524444 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e923689-aa01-44f4-941e-56418b1c3fe5-kube-api-access-ktps9" (OuterVolumeSpecName: "kube-api-access-ktps9") pod "3e923689-aa01-44f4-941e-56418b1c3fe5" (UID: "3e923689-aa01-44f4-941e-56418b1c3fe5"). InnerVolumeSpecName "kube-api-access-ktps9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.559252 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6b239dc4-c3c4-4d31-b23f-83a48fc0d72f" (UID: "6b239dc4-c3c4-4d31-b23f-83a48fc0d72f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.601835 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a38ad103-c6f8-4570-80f0-c9e7d3577588-logs\") pod \"a38ad103-c6f8-4570-80f0-c9e7d3577588\" (UID: \"a38ad103-c6f8-4570-80f0-c9e7d3577588\") " Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.602073 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6phx\" (UniqueName: \"kubernetes.io/projected/a38ad103-c6f8-4570-80f0-c9e7d3577588-kube-api-access-t6phx\") pod \"a38ad103-c6f8-4570-80f0-c9e7d3577588\" (UID: \"a38ad103-c6f8-4570-80f0-c9e7d3577588\") " Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.602993 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a38ad103-c6f8-4570-80f0-c9e7d3577588-config-data\") pod \"a38ad103-c6f8-4570-80f0-c9e7d3577588\" (UID: \"a38ad103-c6f8-4570-80f0-c9e7d3577588\") " Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.603313 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a38ad103-c6f8-4570-80f0-c9e7d3577588-scripts\") pod \"a38ad103-c6f8-4570-80f0-c9e7d3577588\" (UID: \"a38ad103-c6f8-4570-80f0-c9e7d3577588\") " Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.603368 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a38ad103-c6f8-4570-80f0-c9e7d3577588-horizon-secret-key\") pod \"a38ad103-c6f8-4570-80f0-c9e7d3577588\" (UID: \"a38ad103-c6f8-4570-80f0-c9e7d3577588\") " Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.604428 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a38ad103-c6f8-4570-80f0-c9e7d3577588-scripts" (OuterVolumeSpecName: "scripts") pod "a38ad103-c6f8-4570-80f0-c9e7d3577588" (UID: "a38ad103-c6f8-4570-80f0-c9e7d3577588"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.604640 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a38ad103-c6f8-4570-80f0-c9e7d3577588-config-data" (OuterVolumeSpecName: "config-data") pod "a38ad103-c6f8-4570-80f0-c9e7d3577588" (UID: "a38ad103-c6f8-4570-80f0-c9e7d3577588"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.605187 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a38ad103-c6f8-4570-80f0-c9e7d3577588-logs" (OuterVolumeSpecName: "logs") pod "a38ad103-c6f8-4570-80f0-c9e7d3577588" (UID: "a38ad103-c6f8-4570-80f0-c9e7d3577588"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.607893 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.607921 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.607932 4636 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a38ad103-c6f8-4570-80f0-c9e7d3577588-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.607940 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a38ad103-c6f8-4570-80f0-c9e7d3577588-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.607948 4636 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.607956 4636 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.607965 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxgvs\" (UniqueName: \"kubernetes.io/projected/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f-kube-api-access-qxgvs\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.607977 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktps9\" (UniqueName: \"kubernetes.io/projected/3e923689-aa01-44f4-941e-56418b1c3fe5-kube-api-access-ktps9\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.607986 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a38ad103-c6f8-4570-80f0-c9e7d3577588-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.616343 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a38ad103-c6f8-4570-80f0-c9e7d3577588-kube-api-access-t6phx" (OuterVolumeSpecName: "kube-api-access-t6phx") pod "a38ad103-c6f8-4570-80f0-c9e7d3577588" (UID: "a38ad103-c6f8-4570-80f0-c9e7d3577588"). InnerVolumeSpecName "kube-api-access-t6phx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.618742 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-config" (OuterVolumeSpecName: "config") pod "3e923689-aa01-44f4-941e-56418b1c3fe5" (UID: "3e923689-aa01-44f4-941e-56418b1c3fe5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.619242 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a38ad103-c6f8-4570-80f0-c9e7d3577588-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a38ad103-c6f8-4570-80f0-c9e7d3577588" (UID: "a38ad103-c6f8-4570-80f0-c9e7d3577588"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.624149 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3e923689-aa01-44f4-941e-56418b1c3fe5" (UID: "3e923689-aa01-44f4-941e-56418b1c3fe5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.627275 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3e923689-aa01-44f4-941e-56418b1c3fe5" (UID: "3e923689-aa01-44f4-941e-56418b1c3fe5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.683664 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3e923689-aa01-44f4-941e-56418b1c3fe5" (UID: "3e923689-aa01-44f4-941e-56418b1c3fe5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.709295 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.709330 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6phx\" (UniqueName: \"kubernetes.io/projected/a38ad103-c6f8-4570-80f0-c9e7d3577588-kube-api-access-t6phx\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.709343 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.709353 4636 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.709361 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e923689-aa01-44f4-941e-56418b1c3fe5-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.709369 4636 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a38ad103-c6f8-4570-80f0-c9e7d3577588-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.908882 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-6cc7cf4789-zfvq9" event={"ID":"6b239dc4-c3c4-4d31-b23f-83a48fc0d72f","Type":"ContainerDied","Data":"e14bd899f89327f91a219a7032b64e0cffef887cece58496c2906ccfe7abb3f7"} Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.908900 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6cc7cf4789-zfvq9" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.911750 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-68zzd" event={"ID":"3e923689-aa01-44f4-941e-56418b1c3fe5","Type":"ContainerDied","Data":"8f7d6643cd2557bbe5430c8ea00836d9746c6aa9a2cc16b2f7e320f628c86502"} Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.911808 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-68zzd" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.914404 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64994b5457-95phr" event={"ID":"a38ad103-c6f8-4570-80f0-c9e7d3577588","Type":"ContainerDied","Data":"c8219238ae112e4ff7e6da49a2e734d841a2b987526bd0f7baa9bee5069dc02b"} Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.914548 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64994b5457-95phr" Oct 03 14:19:39 crc kubenswrapper[4636]: I1003 14:19:39.984910 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6cc7cf4789-zfvq9"] Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.000608 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6cc7cf4789-zfvq9"] Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.017463 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64994b5457-95phr"] Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.028469 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-64994b5457-95phr"] Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.035127 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-68zzd"] Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.048869 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-68zzd"] Oct 03 14:19:40 crc kubenswrapper[4636]: E1003 14:19:40.167353 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Oct 03 14:19:40 crc kubenswrapper[4636]: E1003 14:19:40.167726 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2h94q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-5dc9p_openstack(0eb04b62-9d0b-4dda-aff1-022bed4af5b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:19:40 crc kubenswrapper[4636]: E1003 14:19:40.168867 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-5dc9p" podUID="0eb04b62-9d0b-4dda-aff1-022bed4af5b4" Oct 03 14:19:40 crc kubenswrapper[4636]: W1003 14:19:40.179518 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0025da7c_17f3_4036_a9fc_3330508c11cd.slice/crio-b89e72303cfcfcd94aed9b187cbea69cbe63a564c4718ae9331d601469822aca WatchSource:0}: Error finding container b89e72303cfcfcd94aed9b187cbea69cbe63a564c4718ae9331d601469822aca: Status 404 returned error can't find the container with id b89e72303cfcfcd94aed9b187cbea69cbe63a564c4718ae9331d601469822aca Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.183977 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5c8488ff4f-4mhlb" Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.219753 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f783ef7-df5d-40c8-968d-f965ec7c78e6-logs\") pod \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\" (UID: \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\") " Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.219896 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwd7z\" (UniqueName: \"kubernetes.io/projected/4f783ef7-df5d-40c8-968d-f965ec7c78e6-kube-api-access-mwd7z\") pod \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\" (UID: \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\") " Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.219925 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4f783ef7-df5d-40c8-968d-f965ec7c78e6-horizon-secret-key\") pod \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\" (UID: \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\") " Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.219968 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f783ef7-df5d-40c8-968d-f965ec7c78e6-config-data\") pod \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\" (UID: \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\") " Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.220056 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f783ef7-df5d-40c8-968d-f965ec7c78e6-scripts\") pod \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\" (UID: \"4f783ef7-df5d-40c8-968d-f965ec7c78e6\") " Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.220082 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f783ef7-df5d-40c8-968d-f965ec7c78e6-logs" (OuterVolumeSpecName: "logs") pod "4f783ef7-df5d-40c8-968d-f965ec7c78e6" (UID: "4f783ef7-df5d-40c8-968d-f965ec7c78e6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.220851 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f783ef7-df5d-40c8-968d-f965ec7c78e6-scripts" (OuterVolumeSpecName: "scripts") pod "4f783ef7-df5d-40c8-968d-f965ec7c78e6" (UID: "4f783ef7-df5d-40c8-968d-f965ec7c78e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.220962 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f783ef7-df5d-40c8-968d-f965ec7c78e6-config-data" (OuterVolumeSpecName: "config-data") pod "4f783ef7-df5d-40c8-968d-f965ec7c78e6" (UID: "4f783ef7-df5d-40c8-968d-f965ec7c78e6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.221123 4636 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f783ef7-df5d-40c8-968d-f965ec7c78e6-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.221142 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f783ef7-df5d-40c8-968d-f965ec7c78e6-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.224306 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f783ef7-df5d-40c8-968d-f965ec7c78e6-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4f783ef7-df5d-40c8-968d-f965ec7c78e6" (UID: "4f783ef7-df5d-40c8-968d-f965ec7c78e6"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.225076 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f783ef7-df5d-40c8-968d-f965ec7c78e6-kube-api-access-mwd7z" (OuterVolumeSpecName: "kube-api-access-mwd7z") pod "4f783ef7-df5d-40c8-968d-f965ec7c78e6" (UID: "4f783ef7-df5d-40c8-968d-f965ec7c78e6"). InnerVolumeSpecName "kube-api-access-mwd7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.323225 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwd7z\" (UniqueName: \"kubernetes.io/projected/4f783ef7-df5d-40c8-968d-f965ec7c78e6-kube-api-access-mwd7z\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.323558 4636 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4f783ef7-df5d-40c8-968d-f965ec7c78e6-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.323571 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f783ef7-df5d-40c8-968d-f965ec7c78e6-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.704868 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.805563 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e923689-aa01-44f4-941e-56418b1c3fe5" path="/var/lib/kubelet/pods/3e923689-aa01-44f4-941e-56418b1c3fe5/volumes" Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.806659 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b239dc4-c3c4-4d31-b23f-83a48fc0d72f" path="/var/lib/kubelet/pods/6b239dc4-c3c4-4d31-b23f-83a48fc0d72f/volumes" Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.807073 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a38ad103-c6f8-4570-80f0-c9e7d3577588" path="/var/lib/kubelet/pods/a38ad103-c6f8-4570-80f0-c9e7d3577588/volumes" Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.923400 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c5bc9456-rfvns" event={"ID":"0025da7c-17f3-4036-a9fc-3330508c11cd","Type":"ContainerStarted","Data":"b89e72303cfcfcd94aed9b187cbea69cbe63a564c4718ae9331d601469822aca"} Oct 03 14:19:40 crc kubenswrapper[4636]: 
I1003 14:19:40.925125 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c8488ff4f-4mhlb" event={"ID":"4f783ef7-df5d-40c8-968d-f965ec7c78e6","Type":"ContainerDied","Data":"5e8e72331e3cad531e1c3c5eaba36d23ead8f4d9551a3bc6ecfd857067dff972"} Oct 03 14:19:40 crc kubenswrapper[4636]: I1003 14:19:40.925187 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c8488ff4f-4mhlb" Oct 03 14:19:40 crc kubenswrapper[4636]: E1003 14:19:40.926960 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-5dc9p" podUID="0eb04b62-9d0b-4dda-aff1-022bed4af5b4" Oct 03 14:19:41 crc kubenswrapper[4636]: I1003 14:19:41.007324 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c8488ff4f-4mhlb"] Oct 03 14:19:41 crc kubenswrapper[4636]: I1003 14:19:41.012902 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5c8488ff4f-4mhlb"] Oct 03 14:19:41 crc kubenswrapper[4636]: I1003 14:19:41.816218 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-68zzd" podUID="3e923689-aa01-44f4-941e-56418b1c3fe5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Oct 03 14:19:42 crc kubenswrapper[4636]: I1003 14:19:42.803890 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f783ef7-df5d-40c8-968d-f965ec7c78e6" path="/var/lib/kubelet/pods/4f783ef7-df5d-40c8-968d-f965ec7c78e6/volumes" Oct 03 14:19:43 crc kubenswrapper[4636]: W1003 14:19:43.107511 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fbf639b_5c21_47d7_9596_091f6b796167.slice/crio-149adb4504287664c60ae6f29eb7c4cf06b834e2f0815c9e5d62066d3b9ae392 WatchSource:0}: Error finding container 149adb4504287664c60ae6f29eb7c4cf06b834e2f0815c9e5d62066d3b9ae392: Status 404 returned error can't find the container with id 149adb4504287664c60ae6f29eb7c4cf06b834e2f0815c9e5d62066d3b9ae392 Oct 03 14:19:43 crc kubenswrapper[4636]: I1003 14:19:43.128365 4636 scope.go:117] "RemoveContainer" containerID="c28b53c7e699590d302089050ae3122674a36c7fbdb9e56e50be35b6965e2212" Oct 03 14:19:43 crc kubenswrapper[4636]: E1003 14:19:43.235460 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 03 14:19:43 crc kubenswrapper[4636]: E1003 14:19:43.235618 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-96vlm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-zgg4h_openstack(48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 14:19:43 crc kubenswrapper[4636]: E1003 14:19:43.237387 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-zgg4h" podUID="48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49" Oct 03 14:19:43 crc kubenswrapper[4636]: I1003 14:19:43.597408 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7976d47688-kx5v5"] Oct 03 14:19:43 crc kubenswrapper[4636]: I1003 14:19:43.697246 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-254t7"] Oct 03 14:19:43 crc kubenswrapper[4636]: I1003 14:19:43.952776 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8fbf639b-5c21-47d7-9596-091f6b796167","Type":"ContainerStarted","Data":"149adb4504287664c60ae6f29eb7c4cf06b834e2f0815c9e5d62066d3b9ae392"} Oct 03 14:19:43 crc kubenswrapper[4636]: I1003 14:19:43.954309 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"16535f8a-4be6-4805-b69a-667593095ad0","Type":"ContainerStarted","Data":"9ad45015fc27064a0fea5b605546900b0cefbb905c30f91a6b3c9e7d96bd9b55"} Oct 03 14:19:43 crc kubenswrapper[4636]: I1003 14:19:43.954467 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="16535f8a-4be6-4805-b69a-667593095ad0" containerName="glance-log" containerID="cri-o://3ae7a32d96d80d6f23a7a881b376db604449bac5f6cc8b484efb16c7ad1e544a" gracePeriod=30 Oct 03 14:19:43 crc kubenswrapper[4636]: I1003 14:19:43.954501 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="16535f8a-4be6-4805-b69a-667593095ad0" containerName="glance-httpd" containerID="cri-o://9ad45015fc27064a0fea5b605546900b0cefbb905c30f91a6b3c9e7d96bd9b55" gracePeriod=30 Oct 03 14:19:43 crc kubenswrapper[4636]: I1003 14:19:43.986718 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=38.986696065 podStartE2EDuration="38.986696065s" podCreationTimestamp="2025-10-03 14:19:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:19:43.981910281 +0000 UTC m=+1133.840636528" watchObservedRunningTime="2025-10-03 14:19:43.986696065 +0000 UTC m=+1133.845422322" Oct 03 14:19:44 crc kubenswrapper[4636]: E1003 14:19:44.060341 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-zgg4h" podUID="48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49" Oct 03 14:19:44 crc kubenswrapper[4636]: I1003 14:19:44.259246 4636 scope.go:117] "RemoveContainer" containerID="c2f9e50356dae7c0b3c9bc71ae0827e7ad93c2b989d2c9ef131f81aced467fea" Oct 03 14:19:44 crc kubenswrapper[4636]: I1003 14:19:44.967490 4636 generic.go:334] "Generic (PLEG): container finished" podID="16535f8a-4be6-4805-b69a-667593095ad0" containerID="9ad45015fc27064a0fea5b605546900b0cefbb905c30f91a6b3c9e7d96bd9b55" exitCode=143 Oct 03 14:19:44 crc kubenswrapper[4636]: I1003 14:19:44.967920 4636 generic.go:334] "Generic (PLEG): container finished" podID="16535f8a-4be6-4805-b69a-667593095ad0" containerID="3ae7a32d96d80d6f23a7a881b376db604449bac5f6cc8b484efb16c7ad1e544a" exitCode=143 Oct 03 14:19:44 crc kubenswrapper[4636]: I1003 14:19:44.968030 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16535f8a-4be6-4805-b69a-667593095ad0","Type":"ContainerDied","Data":"9ad45015fc27064a0fea5b605546900b0cefbb905c30f91a6b3c9e7d96bd9b55"} Oct 03 14:19:44 crc kubenswrapper[4636]: I1003 14:19:44.968063 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16535f8a-4be6-4805-b69a-667593095ad0","Type":"ContainerDied","Data":"3ae7a32d96d80d6f23a7a881b376db604449bac5f6cc8b484efb16c7ad1e544a"} Oct 03 14:19:44 crc kubenswrapper[4636]: I1003 14:19:44.973984 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8fbf639b-5c21-47d7-9596-091f6b796167","Type":"ContainerStarted","Data":"07220c4837c958438f7d722ed3c4b264b8bc20c858517d2c129d21ba2de945a8"} Oct 03 14:19:44 crc kubenswrapper[4636]: I1003 14:19:44.977008 4636 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-254t7" event={"ID":"9e686877-1f2c-4049-8f72-2788c4ff74b8","Type":"ContainerStarted","Data":"f46a0f53bb2bbc683d0e74f4ae2fc16c06ca7b7ca12927cfe1cd9d0d8cf74d49"} Oct 03 14:19:44 crc kubenswrapper[4636]: I1003 14:19:44.979253 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7976d47688-kx5v5" event={"ID":"92ef2fa8-5e4e-49f1-8840-01b5be29d036","Type":"ContainerStarted","Data":"9f3ab565914dc40550b11d09227c0d9ef5e347c0c2ce97b84ce364dc428f4293"} Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.019928 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.109396 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16535f8a-4be6-4805-b69a-667593095ad0-logs\") pod \"16535f8a-4be6-4805-b69a-667593095ad0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.109454 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-scripts\") pod \"16535f8a-4be6-4805-b69a-667593095ad0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.109575 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8dtq\" (UniqueName: \"kubernetes.io/projected/16535f8a-4be6-4805-b69a-667593095ad0-kube-api-access-x8dtq\") pod \"16535f8a-4be6-4805-b69a-667593095ad0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.109601 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-config-data\") pod \"16535f8a-4be6-4805-b69a-667593095ad0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.109638 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-combined-ca-bundle\") pod \"16535f8a-4be6-4805-b69a-667593095ad0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.109677 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16535f8a-4be6-4805-b69a-667593095ad0-httpd-run\") pod \"16535f8a-4be6-4805-b69a-667593095ad0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.109713 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"16535f8a-4be6-4805-b69a-667593095ad0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.109744 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-public-tls-certs\") pod \"16535f8a-4be6-4805-b69a-667593095ad0\" (UID: \"16535f8a-4be6-4805-b69a-667593095ad0\") " Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 
14:19:45.110475 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16535f8a-4be6-4805-b69a-667593095ad0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "16535f8a-4be6-4805-b69a-667593095ad0" (UID: "16535f8a-4be6-4805-b69a-667593095ad0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.111477 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16535f8a-4be6-4805-b69a-667593095ad0-logs" (OuterVolumeSpecName: "logs") pod "16535f8a-4be6-4805-b69a-667593095ad0" (UID: "16535f8a-4be6-4805-b69a-667593095ad0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.113123 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16535f8a-4be6-4805-b69a-667593095ad0-kube-api-access-x8dtq" (OuterVolumeSpecName: "kube-api-access-x8dtq") pod "16535f8a-4be6-4805-b69a-667593095ad0" (UID: "16535f8a-4be6-4805-b69a-667593095ad0"). InnerVolumeSpecName "kube-api-access-x8dtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.115404 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-scripts" (OuterVolumeSpecName: "scripts") pod "16535f8a-4be6-4805-b69a-667593095ad0" (UID: "16535f8a-4be6-4805-b69a-667593095ad0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.115560 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "16535f8a-4be6-4805-b69a-667593095ad0" (UID: "16535f8a-4be6-4805-b69a-667593095ad0"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.141804 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16535f8a-4be6-4805-b69a-667593095ad0" (UID: "16535f8a-4be6-4805-b69a-667593095ad0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.159415 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "16535f8a-4be6-4805-b69a-667593095ad0" (UID: "16535f8a-4be6-4805-b69a-667593095ad0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.192800 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-config-data" (OuterVolumeSpecName: "config-data") pod "16535f8a-4be6-4805-b69a-667593095ad0" (UID: "16535f8a-4be6-4805-b69a-667593095ad0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.215952 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8dtq\" (UniqueName: \"kubernetes.io/projected/16535f8a-4be6-4805-b69a-667593095ad0-kube-api-access-x8dtq\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.215980 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.215994 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.216013 4636 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16535f8a-4be6-4805-b69a-667593095ad0-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.216043 4636 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.216054 4636 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.216066 4636 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16535f8a-4be6-4805-b69a-667593095ad0-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.216077 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16535f8a-4be6-4805-b69a-667593095ad0-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.240089 4636 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 03 14:19:45 crc kubenswrapper[4636]: I1003 14:19:45.317650 4636 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.006461 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-254t7" event={"ID":"9e686877-1f2c-4049-8f72-2788c4ff74b8","Type":"ContainerStarted","Data":"a895fd68ed7ee228433cf4e51a3f47a322af9549d5459155ccc3bc9373dd2cd5"} Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.011822 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8fbf639b-5c21-47d7-9596-091f6b796167","Type":"ContainerStarted","Data":"90137aba5d1d67c92f563994412f3500af0959d309d1ae4e0938c4819741745c"} Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.018834 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ba9d4a1-300f-4367-ba2e-528ed4635dfd","Type":"ContainerStarted","Data":"6d6682058ef934697b8fa17d5d6c6d2a077c3a6fe5e5007f59272a5198efa3b2"} Oct 03 
14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.024010 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c5bc9456-rfvns" event={"ID":"0025da7c-17f3-4036-a9fc-3330508c11cd","Type":"ContainerStarted","Data":"06f679dbd2162a16446c862c381a58ad1d1e1b42d65691283d13ea935216cc4a"} Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.029524 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hv2wz" event={"ID":"02e452cc-6659-4abd-88ff-d9e731b9b1ef","Type":"ContainerStarted","Data":"e186ebaf19345cada0ff2fcd09f9e68a719a10444ee9ec1f7097c7a5988eb3eb"} Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.032938 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-254t7" podStartSLOduration=32.032917411 podStartE2EDuration="32.032917411s" podCreationTimestamp="2025-10-03 14:19:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:19:46.030529859 +0000 UTC m=+1135.889256106" watchObservedRunningTime="2025-10-03 14:19:46.032917411 +0000 UTC m=+1135.891643658" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.040466 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16535f8a-4be6-4805-b69a-667593095ad0","Type":"ContainerDied","Data":"59ce9823ac6f70dbf3bf8251786ebba92cad8730e822b8c9acdc1621e7d581ce"} Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.040508 4636 scope.go:117] "RemoveContainer" containerID="9ad45015fc27064a0fea5b605546900b0cefbb905c30f91a6b3c9e7d96bd9b55" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.040605 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.084312 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=39.084271745 podStartE2EDuration="39.084271745s" podCreationTimestamp="2025-10-03 14:19:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:19:46.060255861 +0000 UTC m=+1135.918982128" watchObservedRunningTime="2025-10-03 14:19:46.084271745 +0000 UTC m=+1135.942997992" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.100768 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-hv2wz" podStartSLOduration=5.662097887 podStartE2EDuration="49.100745673s" podCreationTimestamp="2025-10-03 14:18:57 +0000 UTC" firstStartedPulling="2025-10-03 14:18:59.70963959 +0000 UTC m=+1089.568365837" lastFinishedPulling="2025-10-03 14:19:43.148287376 +0000 UTC m=+1133.007013623" observedRunningTime="2025-10-03 14:19:46.084403248 +0000 UTC m=+1135.943129495" watchObservedRunningTime="2025-10-03 14:19:46.100745673 +0000 UTC m=+1135.959471920" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.113259 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.122918 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.124290 4636 scope.go:117] "RemoveContainer" containerID="3ae7a32d96d80d6f23a7a881b376db604449bac5f6cc8b484efb16c7ad1e544a" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.159271 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:19:46 crc kubenswrapper[4636]: E1003 14:19:46.159685 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e923689-aa01-44f4-941e-56418b1c3fe5" containerName="init" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.159706 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e923689-aa01-44f4-941e-56418b1c3fe5" containerName="init" Oct 03 14:19:46 crc kubenswrapper[4636]: E1003 14:19:46.159739 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16535f8a-4be6-4805-b69a-667593095ad0" containerName="glance-log" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.159748 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="16535f8a-4be6-4805-b69a-667593095ad0" containerName="glance-log" Oct 03 14:19:46 crc kubenswrapper[4636]: E1003 14:19:46.159770 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e923689-aa01-44f4-941e-56418b1c3fe5" containerName="dnsmasq-dns" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.159779 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e923689-aa01-44f4-941e-56418b1c3fe5" containerName="dnsmasq-dns" Oct 03 14:19:46 crc kubenswrapper[4636]: E1003 14:19:46.159790 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16535f8a-4be6-4805-b69a-667593095ad0" containerName="glance-httpd" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.159796 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="16535f8a-4be6-4805-b69a-667593095ad0" containerName="glance-httpd" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.159945 4636 
memory_manager.go:354] "RemoveStaleState removing state" podUID="16535f8a-4be6-4805-b69a-667593095ad0" containerName="glance-httpd" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.159957 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e923689-aa01-44f4-941e-56418b1c3fe5" containerName="dnsmasq-dns" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.159971 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="16535f8a-4be6-4805-b69a-667593095ad0" containerName="glance-log" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.160868 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.220263 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.220490 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.225079 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.349926 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.350037 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs2hb\" (UniqueName: \"kubernetes.io/projected/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-kube-api-access-gs2hb\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.350149 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-logs\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.350179 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.350227 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-config-data\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.350257 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-httpd-run\") pod \"glance-default-external-api-0\" 
(UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.350299 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.350329 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-scripts\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.451897 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.451965 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-config-data\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.451995 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.452027 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.452055 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-scripts\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.452083 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.452204 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs2hb\" (UniqueName: \"kubernetes.io/projected/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-kube-api-access-gs2hb\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " 
pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.452326 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-logs\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.452779 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-logs\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.453585 4636 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.455812 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.460028 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.460204 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-config-data\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.463993 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-scripts\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.475358 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs2hb\" (UniqueName: \"kubernetes.io/projected/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-kube-api-access-gs2hb\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.477868 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.517207 4636 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.558896 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 14:19:46 crc kubenswrapper[4636]: I1003 14:19:46.810640 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16535f8a-4be6-4805-b69a-667593095ad0" path="/var/lib/kubelet/pods/16535f8a-4be6-4805-b69a-667593095ad0/volumes" Oct 03 14:19:47 crc kubenswrapper[4636]: I1003 14:19:47.069524 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c5bc9456-rfvns" event={"ID":"0025da7c-17f3-4036-a9fc-3330508c11cd","Type":"ContainerStarted","Data":"79ec2b52d512bd5fdecc8099ed615849a5e48db694fa62ea57cc7a7daeb17a1c"} Oct 03 14:19:47 crc kubenswrapper[4636]: I1003 14:19:47.078415 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7976d47688-kx5v5" event={"ID":"92ef2fa8-5e4e-49f1-8840-01b5be29d036","Type":"ContainerStarted","Data":"6f203755d3b7d2412b112d16f7778187f0cdb274206e2b3e4aaeccb274cae768"} Oct 03 14:19:47 crc kubenswrapper[4636]: I1003 14:19:47.078467 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7976d47688-kx5v5" event={"ID":"92ef2fa8-5e4e-49f1-8840-01b5be29d036","Type":"ContainerStarted","Data":"f07e34e6b6d7315b30759822fbb4041e8a844861251cba7541a6632092e00e7f"} Oct 03 14:19:47 crc kubenswrapper[4636]: I1003 14:19:47.125207 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-8c5bc9456-rfvns" podStartSLOduration=35.348263382 podStartE2EDuration="40.125190314s" podCreationTimestamp="2025-10-03 14:19:07 +0000 UTC" firstStartedPulling="2025-10-03 14:19:40.183030276 +0000 UTC m=+1130.041756523" lastFinishedPulling="2025-10-03 14:19:44.959957208 +0000 UTC m=+1134.818683455" observedRunningTime="2025-10-03 14:19:47.123364347 +0000 UTC m=+1136.982090594" watchObservedRunningTime="2025-10-03 14:19:47.125190314 +0000 UTC m=+1136.983916551" Oct 03 14:19:47 crc kubenswrapper[4636]: I1003 14:19:47.284469 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7976d47688-kx5v5" podStartSLOduration=38.623256807 podStartE2EDuration="40.284443581s" podCreationTimestamp="2025-10-03 14:19:07 +0000 UTC" firstStartedPulling="2025-10-03 14:19:44.080319547 +0000 UTC m=+1133.939045794" lastFinishedPulling="2025-10-03 14:19:45.741506321 +0000 UTC m=+1135.600232568" observedRunningTime="2025-10-03 14:19:47.175885041 +0000 UTC m=+1137.034611288" watchObservedRunningTime="2025-10-03 14:19:47.284443581 +0000 UTC m=+1137.143169838" Oct 03 14:19:47 crc kubenswrapper[4636]: I1003 14:19:47.290265 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:19:47 crc kubenswrapper[4636]: W1003 14:19:47.327324 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdd6b4ee_372a_42a0_a353_b3a82463d3ff.slice/crio-68d2c7efcfa8d8d9cdbf850cb801611634bb7eff15d0eba7ccfa1d44eb43db54 WatchSource:0}: Error finding container 68d2c7efcfa8d8d9cdbf850cb801611634bb7eff15d0eba7ccfa1d44eb43db54: Status 404 returned error can't find the container with id 
68d2c7efcfa8d8d9cdbf850cb801611634bb7eff15d0eba7ccfa1d44eb43db54 Oct 03 14:19:47 crc kubenswrapper[4636]: I1003 14:19:47.691941 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:47 crc kubenswrapper[4636]: I1003 14:19:47.692294 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:19:47 crc kubenswrapper[4636]: I1003 14:19:47.973642 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:47 crc kubenswrapper[4636]: I1003 14:19:47.973679 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:19:48 crc kubenswrapper[4636]: I1003 14:19:48.096083 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fdd6b4ee-372a-42a0-a353-b3a82463d3ff","Type":"ContainerStarted","Data":"68d2c7efcfa8d8d9cdbf850cb801611634bb7eff15d0eba7ccfa1d44eb43db54"} Oct 03 14:19:48 crc kubenswrapper[4636]: I1003 14:19:48.124412 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 14:19:48 crc kubenswrapper[4636]: I1003 14:19:48.124456 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 14:19:48 crc kubenswrapper[4636]: I1003 14:19:48.204312 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 14:19:48 crc kubenswrapper[4636]: I1003 14:19:48.204800 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 14:19:49 crc kubenswrapper[4636]: I1003 14:19:49.107190 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 14:19:49 crc kubenswrapper[4636]: I1003 14:19:49.107603 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 14:19:50 crc kubenswrapper[4636]: I1003 14:19:50.142481 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fdd6b4ee-372a-42a0-a353-b3a82463d3ff","Type":"ContainerStarted","Data":"4c704ee2e692fc51c22595e16d7e80038cffaa667db593234f019af812029b0c"} Oct 03 14:19:51 crc kubenswrapper[4636]: I1003 14:19:51.149986 4636 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 14:19:55 crc kubenswrapper[4636]: I1003 14:19:55.186520 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ba9d4a1-300f-4367-ba2e-528ed4635dfd","Type":"ContainerStarted","Data":"eabf2762dee0eff5ecb5cecd5dfc7ea51b47118e99303917c50598058e9d8c8b"} Oct 03 14:19:55 crc kubenswrapper[4636]: I1003 14:19:55.188464 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fdd6b4ee-372a-42a0-a353-b3a82463d3ff","Type":"ContainerStarted","Data":"eeec22460d487729f2abc1d313a5bd005c59d08335b1724f0b02955f839bb9f1"} Oct 03 14:19:55 crc kubenswrapper[4636]: I1003 14:19:55.206865 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.206847344 podStartE2EDuration="9.206847344s" podCreationTimestamp="2025-10-03 14:19:46 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:19:55.205508649 +0000 UTC m=+1145.064234916" watchObservedRunningTime="2025-10-03 14:19:55.206847344 +0000 UTC m=+1145.065573591" Oct 03 14:19:56 crc kubenswrapper[4636]: I1003 14:19:56.559544 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 14:19:56 crc kubenswrapper[4636]: I1003 14:19:56.560143 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 14:19:56 crc kubenswrapper[4636]: I1003 14:19:56.602773 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 14:19:56 crc kubenswrapper[4636]: I1003 14:19:56.619653 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 14:19:57 crc kubenswrapper[4636]: I1003 14:19:57.214852 4636 generic.go:334] "Generic (PLEG): container finished" podID="9e686877-1f2c-4049-8f72-2788c4ff74b8" containerID="a895fd68ed7ee228433cf4e51a3f47a322af9549d5459155ccc3bc9373dd2cd5" exitCode=0 Oct 03 14:19:57 crc kubenswrapper[4636]: I1003 14:19:57.215420 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-254t7" event={"ID":"9e686877-1f2c-4049-8f72-2788c4ff74b8","Type":"ContainerDied","Data":"a895fd68ed7ee228433cf4e51a3f47a322af9549d5459155ccc3bc9373dd2cd5"} Oct 03 14:19:57 crc kubenswrapper[4636]: I1003 14:19:57.215907 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 14:19:57 crc kubenswrapper[4636]: I1003 14:19:57.215942 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 14:19:57 crc kubenswrapper[4636]: I1003 14:19:57.694475 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7976d47688-kx5v5" podUID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Oct 03 14:19:57 crc kubenswrapper[4636]: I1003 14:19:57.974920 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-8c5bc9456-rfvns" podUID="0025da7c-17f3-4036-a9fc-3330508c11cd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.229435 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5dc9p" event={"ID":"0eb04b62-9d0b-4dda-aff1-022bed4af5b4","Type":"ContainerStarted","Data":"f0629691999792afdb898c8aa1583f7b512e37775489d5965de17947ce8238f7"} Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.256452 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-5dc9p" podStartSLOduration=3.892321571 podStartE2EDuration="1m1.256432694s" podCreationTimestamp="2025-10-03 14:18:57 +0000 UTC" firstStartedPulling="2025-10-03 14:18:59.716534108 +0000 UTC m=+1089.575260355" lastFinishedPulling="2025-10-03 14:19:57.080645231 +0000 UTC m=+1146.939371478" observedRunningTime="2025-10-03 14:19:58.245584482 +0000 UTC m=+1148.104310719" 
watchObservedRunningTime="2025-10-03 14:19:58.256432694 +0000 UTC m=+1148.115158941" Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.559229 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.559660 4636 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.567476 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.666747 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-254t7" Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.807056 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-scripts\") pod \"9e686877-1f2c-4049-8f72-2788c4ff74b8\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.807165 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-combined-ca-bundle\") pod \"9e686877-1f2c-4049-8f72-2788c4ff74b8\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.807195 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-config-data\") pod \"9e686877-1f2c-4049-8f72-2788c4ff74b8\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.807219 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cngb\" (UniqueName: \"kubernetes.io/projected/9e686877-1f2c-4049-8f72-2788c4ff74b8-kube-api-access-9cngb\") pod \"9e686877-1f2c-4049-8f72-2788c4ff74b8\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.807281 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-credential-keys\") pod \"9e686877-1f2c-4049-8f72-2788c4ff74b8\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.807340 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-fernet-keys\") pod \"9e686877-1f2c-4049-8f72-2788c4ff74b8\" (UID: \"9e686877-1f2c-4049-8f72-2788c4ff74b8\") " Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.813021 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9e686877-1f2c-4049-8f72-2788c4ff74b8" (UID: "9e686877-1f2c-4049-8f72-2788c4ff74b8"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.816695 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9e686877-1f2c-4049-8f72-2788c4ff74b8" (UID: "9e686877-1f2c-4049-8f72-2788c4ff74b8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.828521 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e686877-1f2c-4049-8f72-2788c4ff74b8-kube-api-access-9cngb" (OuterVolumeSpecName: "kube-api-access-9cngb") pod "9e686877-1f2c-4049-8f72-2788c4ff74b8" (UID: "9e686877-1f2c-4049-8f72-2788c4ff74b8"). InnerVolumeSpecName "kube-api-access-9cngb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.837164 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-scripts" (OuterVolumeSpecName: "scripts") pod "9e686877-1f2c-4049-8f72-2788c4ff74b8" (UID: "9e686877-1f2c-4049-8f72-2788c4ff74b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.845994 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e686877-1f2c-4049-8f72-2788c4ff74b8" (UID: "9e686877-1f2c-4049-8f72-2788c4ff74b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.860301 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-config-data" (OuterVolumeSpecName: "config-data") pod "9e686877-1f2c-4049-8f72-2788c4ff74b8" (UID: "9e686877-1f2c-4049-8f72-2788c4ff74b8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.910692 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.910730 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.910739 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cngb\" (UniqueName: \"kubernetes.io/projected/9e686877-1f2c-4049-8f72-2788c4ff74b8-kube-api-access-9cngb\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.910751 4636 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.910759 4636 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:58 crc kubenswrapper[4636]: I1003 14:19:58.910768 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e686877-1f2c-4049-8f72-2788c4ff74b8-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.241877 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-254t7" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.241920 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-254t7" event={"ID":"9e686877-1f2c-4049-8f72-2788c4ff74b8","Type":"ContainerDied","Data":"f46a0f53bb2bbc683d0e74f4ae2fc16c06ca7b7ca12927cfe1cd9d0d8cf74d49"} Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.241943 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f46a0f53bb2bbc683d0e74f4ae2fc16c06ca7b7ca12927cfe1cd9d0d8cf74d49" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.333857 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7b4f64b6bf-z54p6"] Oct 03 14:19:59 crc kubenswrapper[4636]: E1003 14:19:59.334541 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e686877-1f2c-4049-8f72-2788c4ff74b8" containerName="keystone-bootstrap" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.334640 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e686877-1f2c-4049-8f72-2788c4ff74b8" containerName="keystone-bootstrap" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.334925 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e686877-1f2c-4049-8f72-2788c4ff74b8" containerName="keystone-bootstrap" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.335701 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.340512 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.340782 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.342557 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.345520 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.345644 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.351767 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-l2krf" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.370949 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7b4f64b6bf-z54p6"] Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.425248 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ddc1097-69d8-4db3-93f1-a43038191aae-public-tls-certs\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.425290 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-442k7\" (UniqueName: \"kubernetes.io/projected/5ddc1097-69d8-4db3-93f1-a43038191aae-kube-api-access-442k7\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.425319 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5ddc1097-69d8-4db3-93f1-a43038191aae-credential-keys\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.425349 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ddc1097-69d8-4db3-93f1-a43038191aae-config-data\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.425375 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ddc1097-69d8-4db3-93f1-a43038191aae-combined-ca-bundle\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.425413 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ddc1097-69d8-4db3-93f1-a43038191aae-internal-tls-certs\") pod 
\"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.425439 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5ddc1097-69d8-4db3-93f1-a43038191aae-fernet-keys\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.425463 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ddc1097-69d8-4db3-93f1-a43038191aae-scripts\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.527307 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5ddc1097-69d8-4db3-93f1-a43038191aae-fernet-keys\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.527359 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ddc1097-69d8-4db3-93f1-a43038191aae-scripts\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.527468 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-442k7\" (UniqueName: \"kubernetes.io/projected/5ddc1097-69d8-4db3-93f1-a43038191aae-kube-api-access-442k7\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.527486 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ddc1097-69d8-4db3-93f1-a43038191aae-public-tls-certs\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.528406 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5ddc1097-69d8-4db3-93f1-a43038191aae-credential-keys\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.528485 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ddc1097-69d8-4db3-93f1-a43038191aae-config-data\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.528551 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ddc1097-69d8-4db3-93f1-a43038191aae-combined-ca-bundle\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " 
pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.528710 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ddc1097-69d8-4db3-93f1-a43038191aae-internal-tls-certs\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.532838 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5ddc1097-69d8-4db3-93f1-a43038191aae-fernet-keys\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.533213 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ddc1097-69d8-4db3-93f1-a43038191aae-internal-tls-certs\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.539970 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ddc1097-69d8-4db3-93f1-a43038191aae-public-tls-certs\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.541441 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ddc1097-69d8-4db3-93f1-a43038191aae-scripts\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.542063 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ddc1097-69d8-4db3-93f1-a43038191aae-config-data\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.542218 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ddc1097-69d8-4db3-93f1-a43038191aae-combined-ca-bundle\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.552783 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-442k7\" (UniqueName: \"kubernetes.io/projected/5ddc1097-69d8-4db3-93f1-a43038191aae-kube-api-access-442k7\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.572381 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5ddc1097-69d8-4db3-93f1-a43038191aae-credential-keys\") pod \"keystone-7b4f64b6bf-z54p6\" (UID: \"5ddc1097-69d8-4db3-93f1-a43038191aae\") " pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:19:59 crc kubenswrapper[4636]: I1003 14:19:59.652559 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:20:00 crc kubenswrapper[4636]: I1003 14:20:00.295520 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7b4f64b6bf-z54p6"] Oct 03 14:20:00 crc kubenswrapper[4636]: W1003 14:20:00.325584 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ddc1097_69d8_4db3_93f1_a43038191aae.slice/crio-9f09cabe1bebc5f758e79a2c41b35b08c8f79230ce367950f8d74c03a761f256 WatchSource:0}: Error finding container 9f09cabe1bebc5f758e79a2c41b35b08c8f79230ce367950f8d74c03a761f256: Status 404 returned error can't find the container with id 9f09cabe1bebc5f758e79a2c41b35b08c8f79230ce367950f8d74c03a761f256 Oct 03 14:20:01 crc kubenswrapper[4636]: I1003 14:20:01.261238 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7b4f64b6bf-z54p6" event={"ID":"5ddc1097-69d8-4db3-93f1-a43038191aae","Type":"ContainerStarted","Data":"9f09cabe1bebc5f758e79a2c41b35b08c8f79230ce367950f8d74c03a761f256"} Oct 03 14:20:01 crc kubenswrapper[4636]: I1003 14:20:01.269072 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 14:20:02 crc kubenswrapper[4636]: I1003 14:20:02.271939 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7b4f64b6bf-z54p6" event={"ID":"5ddc1097-69d8-4db3-93f1-a43038191aae","Type":"ContainerStarted","Data":"60b964cabbd058a78b63f3014ba12d0b8bfd45efa853b36cb7fed336963c3862"} Oct 03 14:20:02 crc kubenswrapper[4636]: I1003 14:20:02.272191 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7b4f64b6bf-z54p6" Oct 03 14:20:02 crc kubenswrapper[4636]: I1003 14:20:02.291687 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7b4f64b6bf-z54p6" podStartSLOduration=3.291670888 podStartE2EDuration="3.291670888s" podCreationTimestamp="2025-10-03 14:19:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:20:02.288214528 +0000 UTC m=+1152.146940785" watchObservedRunningTime="2025-10-03 14:20:02.291670888 +0000 UTC m=+1152.150397135" Oct 03 14:20:03 crc kubenswrapper[4636]: I1003 14:20:03.665267 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 14:20:06 crc kubenswrapper[4636]: I1003 14:20:06.308441 4636 generic.go:334] "Generic (PLEG): container finished" podID="02e452cc-6659-4abd-88ff-d9e731b9b1ef" containerID="e186ebaf19345cada0ff2fcd09f9e68a719a10444ee9ec1f7097c7a5988eb3eb" exitCode=0 Oct 03 14:20:06 crc kubenswrapper[4636]: I1003 14:20:06.309032 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hv2wz" event={"ID":"02e452cc-6659-4abd-88ff-d9e731b9b1ef","Type":"ContainerDied","Data":"e186ebaf19345cada0ff2fcd09f9e68a719a10444ee9ec1f7097c7a5988eb3eb"} Oct 03 14:20:07 crc kubenswrapper[4636]: I1003 14:20:07.680574 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-hv2wz" Oct 03 14:20:07 crc kubenswrapper[4636]: I1003 14:20:07.692418 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7976d47688-kx5v5" podUID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Oct 03 14:20:07 crc kubenswrapper[4636]: E1003 14:20:07.781329 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="6ba9d4a1-300f-4367-ba2e-528ed4635dfd" Oct 03 14:20:07 crc kubenswrapper[4636]: I1003 14:20:07.787631 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfcb5\" (UniqueName: \"kubernetes.io/projected/02e452cc-6659-4abd-88ff-d9e731b9b1ef-kube-api-access-dfcb5\") pod \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\" (UID: \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\") " Oct 03 14:20:07 crc kubenswrapper[4636]: I1003 14:20:07.787689 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e452cc-6659-4abd-88ff-d9e731b9b1ef-config-data\") pod \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\" (UID: \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\") " Oct 03 14:20:07 crc kubenswrapper[4636]: I1003 14:20:07.787725 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02e452cc-6659-4abd-88ff-d9e731b9b1ef-scripts\") pod \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\" (UID: \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\") " Oct 03 14:20:07 crc kubenswrapper[4636]: I1003 14:20:07.787790 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02e452cc-6659-4abd-88ff-d9e731b9b1ef-logs\") pod \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\" (UID: \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\") " Oct 03 14:20:07 crc kubenswrapper[4636]: I1003 14:20:07.787873 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e452cc-6659-4abd-88ff-d9e731b9b1ef-combined-ca-bundle\") pod \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\" (UID: \"02e452cc-6659-4abd-88ff-d9e731b9b1ef\") " Oct 03 14:20:07 crc kubenswrapper[4636]: I1003 14:20:07.790174 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02e452cc-6659-4abd-88ff-d9e731b9b1ef-logs" (OuterVolumeSpecName: "logs") pod "02e452cc-6659-4abd-88ff-d9e731b9b1ef" (UID: "02e452cc-6659-4abd-88ff-d9e731b9b1ef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:20:07 crc kubenswrapper[4636]: I1003 14:20:07.794424 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02e452cc-6659-4abd-88ff-d9e731b9b1ef-kube-api-access-dfcb5" (OuterVolumeSpecName: "kube-api-access-dfcb5") pod "02e452cc-6659-4abd-88ff-d9e731b9b1ef" (UID: "02e452cc-6659-4abd-88ff-d9e731b9b1ef"). InnerVolumeSpecName "kube-api-access-dfcb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:20:07 crc kubenswrapper[4636]: I1003 14:20:07.796298 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e452cc-6659-4abd-88ff-d9e731b9b1ef-scripts" (OuterVolumeSpecName: "scripts") pod "02e452cc-6659-4abd-88ff-d9e731b9b1ef" (UID: "02e452cc-6659-4abd-88ff-d9e731b9b1ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:07 crc kubenswrapper[4636]: I1003 14:20:07.829041 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e452cc-6659-4abd-88ff-d9e731b9b1ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02e452cc-6659-4abd-88ff-d9e731b9b1ef" (UID: "02e452cc-6659-4abd-88ff-d9e731b9b1ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:07 crc kubenswrapper[4636]: I1003 14:20:07.831390 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e452cc-6659-4abd-88ff-d9e731b9b1ef-config-data" (OuterVolumeSpecName: "config-data") pod "02e452cc-6659-4abd-88ff-d9e731b9b1ef" (UID: "02e452cc-6659-4abd-88ff-d9e731b9b1ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:07 crc kubenswrapper[4636]: I1003 14:20:07.890325 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfcb5\" (UniqueName: \"kubernetes.io/projected/02e452cc-6659-4abd-88ff-d9e731b9b1ef-kube-api-access-dfcb5\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:07 crc kubenswrapper[4636]: I1003 14:20:07.890363 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e452cc-6659-4abd-88ff-d9e731b9b1ef-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:07 crc kubenswrapper[4636]: I1003 14:20:07.890378 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02e452cc-6659-4abd-88ff-d9e731b9b1ef-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:07 crc kubenswrapper[4636]: I1003 14:20:07.890387 4636 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02e452cc-6659-4abd-88ff-d9e731b9b1ef-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:07 crc kubenswrapper[4636]: I1003 14:20:07.890396 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e452cc-6659-4abd-88ff-d9e731b9b1ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:07 crc kubenswrapper[4636]: I1003 14:20:07.974304 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-8c5bc9456-rfvns" podUID="0025da7c-17f3-4036-a9fc-3330508c11cd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.327916 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hv2wz" event={"ID":"02e452cc-6659-4abd-88ff-d9e731b9b1ef","Type":"ContainerDied","Data":"260bb49d2f7974e7ebca0e537a869bd83d389cd27321b60cd5b91aa6ec1bf83c"} Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.327978 4636 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="260bb49d2f7974e7ebca0e537a869bd83d389cd27321b60cd5b91aa6ec1bf83c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.327938 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hv2wz" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.329706 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zgg4h" event={"ID":"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49","Type":"ContainerStarted","Data":"18a48e0b0583ccc1d96ce521f6e96af8c5680842f0f276483d011fab120d4498"} Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.332571 4636 generic.go:334] "Generic (PLEG): container finished" podID="0eb04b62-9d0b-4dda-aff1-022bed4af5b4" containerID="f0629691999792afdb898c8aa1583f7b512e37775489d5965de17947ce8238f7" exitCode=0 Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.332629 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5dc9p" event={"ID":"0eb04b62-9d0b-4dda-aff1-022bed4af5b4","Type":"ContainerDied","Data":"f0629691999792afdb898c8aa1583f7b512e37775489d5965de17947ce8238f7"} Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.334779 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ba9d4a1-300f-4367-ba2e-528ed4635dfd","Type":"ContainerStarted","Data":"3b75947feb19706c3e79b90e578326314be1b3fa669a5b3e96558305af9b2f2b"} Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.334935 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ba9d4a1-300f-4367-ba2e-528ed4635dfd" containerName="ceilometer-notification-agent" containerID="cri-o://6d6682058ef934697b8fa17d5d6c6d2a077c3a6fe5e5007f59272a5198efa3b2" gracePeriod=30 Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.335123 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.335180 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ba9d4a1-300f-4367-ba2e-528ed4635dfd" containerName="proxy-httpd" containerID="cri-o://3b75947feb19706c3e79b90e578326314be1b3fa669a5b3e96558305af9b2f2b" gracePeriod=30 Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.335242 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ba9d4a1-300f-4367-ba2e-528ed4635dfd" containerName="sg-core" containerID="cri-o://eabf2762dee0eff5ecb5cecd5dfc7ea51b47118e99303917c50598058e9d8c8b" gracePeriod=30 Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.370265 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-zgg4h" podStartSLOduration=4.681262677 podStartE2EDuration="1m12.370242394s" podCreationTimestamp="2025-10-03 14:18:56 +0000 UTC" firstStartedPulling="2025-10-03 14:18:59.625780608 +0000 UTC m=+1089.484506855" lastFinishedPulling="2025-10-03 14:20:07.314760325 +0000 UTC m=+1157.173486572" observedRunningTime="2025-10-03 14:20:08.357490372 +0000 UTC m=+1158.216216629" watchObservedRunningTime="2025-10-03 14:20:08.370242394 +0000 UTC m=+1158.228968641" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.489649 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6796cf444-9xs6c"] Oct 03 14:20:08 crc kubenswrapper[4636]: E1003 14:20:08.495197 4636 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="02e452cc-6659-4abd-88ff-d9e731b9b1ef" containerName="placement-db-sync" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.495406 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e452cc-6659-4abd-88ff-d9e731b9b1ef" containerName="placement-db-sync" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.495754 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="02e452cc-6659-4abd-88ff-d9e731b9b1ef" containerName="placement-db-sync" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.496974 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.502158 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.502592 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.502784 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.504023 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.504184 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-lsrxj" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.508491 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6796cf444-9xs6c"] Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.508915 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2-internal-tls-certs\") pod \"placement-6796cf444-9xs6c\" (UID: \"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2\") " pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.508961 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2-logs\") pod \"placement-6796cf444-9xs6c\" (UID: \"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2\") " pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.508995 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2-combined-ca-bundle\") pod \"placement-6796cf444-9xs6c\" (UID: \"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2\") " pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.509022 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2-scripts\") pod \"placement-6796cf444-9xs6c\" (UID: \"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2\") " pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.509056 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc89x\" (UniqueName: \"kubernetes.io/projected/7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2-kube-api-access-rc89x\") pod 
\"placement-6796cf444-9xs6c\" (UID: \"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2\") " pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.509084 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2-config-data\") pod \"placement-6796cf444-9xs6c\" (UID: \"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2\") " pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.509120 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2-public-tls-certs\") pod \"placement-6796cf444-9xs6c\" (UID: \"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2\") " pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.610393 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2-internal-tls-certs\") pod \"placement-6796cf444-9xs6c\" (UID: \"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2\") " pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.611203 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2-logs\") pod \"placement-6796cf444-9xs6c\" (UID: \"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2\") " pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.611256 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2-combined-ca-bundle\") pod \"placement-6796cf444-9xs6c\" (UID: \"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2\") " pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.611298 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2-scripts\") pod \"placement-6796cf444-9xs6c\" (UID: \"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2\") " pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.611361 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc89x\" (UniqueName: \"kubernetes.io/projected/7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2-kube-api-access-rc89x\") pod \"placement-6796cf444-9xs6c\" (UID: \"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2\") " pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.611410 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2-config-data\") pod \"placement-6796cf444-9xs6c\" (UID: \"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2\") " pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.611446 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2-public-tls-certs\") pod \"placement-6796cf444-9xs6c\" (UID: \"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2\") " 
pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.612126 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2-logs\") pod \"placement-6796cf444-9xs6c\" (UID: \"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2\") " pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.615998 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2-public-tls-certs\") pod \"placement-6796cf444-9xs6c\" (UID: \"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2\") " pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.616164 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2-config-data\") pod \"placement-6796cf444-9xs6c\" (UID: \"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2\") " pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.618539 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2-scripts\") pod \"placement-6796cf444-9xs6c\" (UID: \"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2\") " pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.628656 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2-internal-tls-certs\") pod \"placement-6796cf444-9xs6c\" (UID: \"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2\") " pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.635345 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2-combined-ca-bundle\") pod \"placement-6796cf444-9xs6c\" (UID: \"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2\") " pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.636075 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc89x\" (UniqueName: \"kubernetes.io/projected/7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2-kube-api-access-rc89x\") pod \"placement-6796cf444-9xs6c\" (UID: \"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2\") " pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:08 crc kubenswrapper[4636]: I1003 14:20:08.817480 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:09 crc kubenswrapper[4636]: I1003 14:20:09.163181 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:20:09 crc kubenswrapper[4636]: I1003 14:20:09.163525 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:20:09 crc kubenswrapper[4636]: I1003 14:20:09.344519 4636 generic.go:334] "Generic (PLEG): container finished" podID="6ba9d4a1-300f-4367-ba2e-528ed4635dfd" containerID="3b75947feb19706c3e79b90e578326314be1b3fa669a5b3e96558305af9b2f2b" exitCode=0 Oct 03 14:20:09 crc kubenswrapper[4636]: I1003 14:20:09.344561 4636 generic.go:334] "Generic (PLEG): container finished" podID="6ba9d4a1-300f-4367-ba2e-528ed4635dfd" containerID="eabf2762dee0eff5ecb5cecd5dfc7ea51b47118e99303917c50598058e9d8c8b" exitCode=2 Oct 03 14:20:09 crc kubenswrapper[4636]: I1003 14:20:09.344726 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ba9d4a1-300f-4367-ba2e-528ed4635dfd","Type":"ContainerDied","Data":"3b75947feb19706c3e79b90e578326314be1b3fa669a5b3e96558305af9b2f2b"} Oct 03 14:20:09 crc kubenswrapper[4636]: I1003 14:20:09.344756 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ba9d4a1-300f-4367-ba2e-528ed4635dfd","Type":"ContainerDied","Data":"eabf2762dee0eff5ecb5cecd5dfc7ea51b47118e99303917c50598058e9d8c8b"} Oct 03 14:20:09 crc kubenswrapper[4636]: I1003 14:20:09.384633 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6796cf444-9xs6c"] Oct 03 14:20:09 crc kubenswrapper[4636]: W1003 14:20:09.404843 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ab1d10b_e0d0_426a_a90f_6f8969e3c8b2.slice/crio-e7278d42f55ccb6600114dcdd381b0f87bbce149e476d48e81ee142a5209bb0e WatchSource:0}: Error finding container e7278d42f55ccb6600114dcdd381b0f87bbce149e476d48e81ee142a5209bb0e: Status 404 returned error can't find the container with id e7278d42f55ccb6600114dcdd381b0f87bbce149e476d48e81ee142a5209bb0e Oct 03 14:20:09 crc kubenswrapper[4636]: I1003 14:20:09.692631 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-5dc9p" Oct 03 14:20:09 crc kubenswrapper[4636]: I1003 14:20:09.740777 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eb04b62-9d0b-4dda-aff1-022bed4af5b4-combined-ca-bundle\") pod \"0eb04b62-9d0b-4dda-aff1-022bed4af5b4\" (UID: \"0eb04b62-9d0b-4dda-aff1-022bed4af5b4\") " Oct 03 14:20:09 crc kubenswrapper[4636]: I1003 14:20:09.741910 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0eb04b62-9d0b-4dda-aff1-022bed4af5b4-db-sync-config-data\") pod \"0eb04b62-9d0b-4dda-aff1-022bed4af5b4\" (UID: \"0eb04b62-9d0b-4dda-aff1-022bed4af5b4\") " Oct 03 14:20:09 crc kubenswrapper[4636]: I1003 14:20:09.742088 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h94q\" (UniqueName: \"kubernetes.io/projected/0eb04b62-9d0b-4dda-aff1-022bed4af5b4-kube-api-access-2h94q\") pod \"0eb04b62-9d0b-4dda-aff1-022bed4af5b4\" (UID: \"0eb04b62-9d0b-4dda-aff1-022bed4af5b4\") " Oct 03 14:20:09 crc kubenswrapper[4636]: I1003 14:20:09.749378 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eb04b62-9d0b-4dda-aff1-022bed4af5b4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0eb04b62-9d0b-4dda-aff1-022bed4af5b4" (UID: "0eb04b62-9d0b-4dda-aff1-022bed4af5b4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:09 crc kubenswrapper[4636]: I1003 14:20:09.756396 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eb04b62-9d0b-4dda-aff1-022bed4af5b4-kube-api-access-2h94q" (OuterVolumeSpecName: "kube-api-access-2h94q") pod "0eb04b62-9d0b-4dda-aff1-022bed4af5b4" (UID: "0eb04b62-9d0b-4dda-aff1-022bed4af5b4"). InnerVolumeSpecName "kube-api-access-2h94q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:20:09 crc kubenswrapper[4636]: I1003 14:20:09.775346 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eb04b62-9d0b-4dda-aff1-022bed4af5b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0eb04b62-9d0b-4dda-aff1-022bed4af5b4" (UID: "0eb04b62-9d0b-4dda-aff1-022bed4af5b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:09 crc kubenswrapper[4636]: I1003 14:20:09.844894 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h94q\" (UniqueName: \"kubernetes.io/projected/0eb04b62-9d0b-4dda-aff1-022bed4af5b4-kube-api-access-2h94q\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:09 crc kubenswrapper[4636]: I1003 14:20:09.844939 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eb04b62-9d0b-4dda-aff1-022bed4af5b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:09 crc kubenswrapper[4636]: I1003 14:20:09.844953 4636 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0eb04b62-9d0b-4dda-aff1-022bed4af5b4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.332496 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.364407 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5dc9p" event={"ID":"0eb04b62-9d0b-4dda-aff1-022bed4af5b4","Type":"ContainerDied","Data":"75786b3cc242fbdb05787fd9d6e3a668f9fb301f1f00b3436c46aba2dbcd68a2"} Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.364443 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75786b3cc242fbdb05787fd9d6e3a668f9fb301f1f00b3436c46aba2dbcd68a2" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.364494 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-5dc9p" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.376997 4636 generic.go:334] "Generic (PLEG): container finished" podID="6ba9d4a1-300f-4367-ba2e-528ed4635dfd" containerID="6d6682058ef934697b8fa17d5d6c6d2a077c3a6fe5e5007f59272a5198efa3b2" exitCode=0 Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.377132 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ba9d4a1-300f-4367-ba2e-528ed4635dfd","Type":"ContainerDied","Data":"6d6682058ef934697b8fa17d5d6c6d2a077c3a6fe5e5007f59272a5198efa3b2"} Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.377164 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ba9d4a1-300f-4367-ba2e-528ed4635dfd","Type":"ContainerDied","Data":"b0705a02839f38e6a5797d69aecce8e13f3afd05a7971c0f43da1d0795b0bc90"} Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.377189 4636 scope.go:117] "RemoveContainer" containerID="3b75947feb19706c3e79b90e578326314be1b3fa669a5b3e96558305af9b2f2b" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.377198 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.386007 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6796cf444-9xs6c" event={"ID":"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2","Type":"ContainerStarted","Data":"cfe14936ddc3c3465fda24a2c6c6c49c8a5fc0d69454f3872267abe105af3f2c"} Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.386125 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6796cf444-9xs6c" event={"ID":"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2","Type":"ContainerStarted","Data":"46264f2da058a05f3becfec82d8e5f47d7f3cf084242e7aa4235e713bf84e475"} Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.386157 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6796cf444-9xs6c" event={"ID":"7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2","Type":"ContainerStarted","Data":"e7278d42f55ccb6600114dcdd381b0f87bbce149e476d48e81ee142a5209bb0e"} Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.386535 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.386561 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6796cf444-9xs6c" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.389107 4636 generic.go:334] "Generic (PLEG): container finished" podID="7d2a38ef-2fad-4a66-a131-2f690ceb72f1" containerID="ea54d616c2574efe0fca015fd7163ec725f13d0b9b15a84b7a1a06c0339a7c22" exitCode=0 Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.389145 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-97znq" event={"ID":"7d2a38ef-2fad-4a66-a131-2f690ceb72f1","Type":"ContainerDied","Data":"ea54d616c2574efe0fca015fd7163ec725f13d0b9b15a84b7a1a06c0339a7c22"} Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.449860 4636 scope.go:117] "RemoveContainer" containerID="eabf2762dee0eff5ecb5cecd5dfc7ea51b47118e99303917c50598058e9d8c8b" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.455956 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6796cf444-9xs6c" podStartSLOduration=2.455928284 podStartE2EDuration="2.455928284s" podCreationTimestamp="2025-10-03 14:20:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:20:10.43655088 +0000 UTC m=+1160.295277127" watchObservedRunningTime="2025-10-03 14:20:10.455928284 +0000 UTC m=+1160.314654531" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.459754 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-sg-core-conf-yaml\") pod \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.459999 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-config-data\") pod \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.464419 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-log-httpd\") pod \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.465773 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6ba9d4a1-300f-4367-ba2e-528ed4635dfd" (UID: "6ba9d4a1-300f-4367-ba2e-528ed4635dfd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.466211 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-scripts\") pod \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.466561 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-combined-ca-bundle\") pod \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.466608 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5pgm\" (UniqueName: \"kubernetes.io/projected/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-kube-api-access-l5pgm\") pod \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.466694 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-run-httpd\") pod \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\" (UID: \"6ba9d4a1-300f-4367-ba2e-528ed4635dfd\") " Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.468301 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6ba9d4a1-300f-4367-ba2e-528ed4635dfd" (UID: "6ba9d4a1-300f-4367-ba2e-528ed4635dfd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.474275 4636 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.474832 4636 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.479301 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-scripts" (OuterVolumeSpecName: "scripts") pod "6ba9d4a1-300f-4367-ba2e-528ed4635dfd" (UID: "6ba9d4a1-300f-4367-ba2e-528ed4635dfd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.485303 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-kube-api-access-l5pgm" (OuterVolumeSpecName: "kube-api-access-l5pgm") pod "6ba9d4a1-300f-4367-ba2e-528ed4635dfd" (UID: "6ba9d4a1-300f-4367-ba2e-528ed4635dfd"). InnerVolumeSpecName "kube-api-access-l5pgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.485485 4636 scope.go:117] "RemoveContainer" containerID="6d6682058ef934697b8fa17d5d6c6d2a077c3a6fe5e5007f59272a5198efa3b2" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.492244 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6ba9d4a1-300f-4367-ba2e-528ed4635dfd" (UID: "6ba9d4a1-300f-4367-ba2e-528ed4635dfd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.526203 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ba9d4a1-300f-4367-ba2e-528ed4635dfd" (UID: "6ba9d4a1-300f-4367-ba2e-528ed4635dfd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.578671 4636 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.578700 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.578710 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.578719 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5pgm\" (UniqueName: \"kubernetes.io/projected/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-kube-api-access-l5pgm\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.590248 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-config-data" (OuterVolumeSpecName: "config-data") pod "6ba9d4a1-300f-4367-ba2e-528ed4635dfd" (UID: "6ba9d4a1-300f-4367-ba2e-528ed4635dfd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.679915 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7f6878bdf6-2vf98"] Oct 03 14:20:10 crc kubenswrapper[4636]: E1003 14:20:10.680638 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba9d4a1-300f-4367-ba2e-528ed4635dfd" containerName="sg-core" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.680732 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba9d4a1-300f-4367-ba2e-528ed4635dfd" containerName="sg-core" Oct 03 14:20:10 crc kubenswrapper[4636]: E1003 14:20:10.680827 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eb04b62-9d0b-4dda-aff1-022bed4af5b4" containerName="barbican-db-sync" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.680898 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eb04b62-9d0b-4dda-aff1-022bed4af5b4" containerName="barbican-db-sync" Oct 03 14:20:10 crc kubenswrapper[4636]: E1003 14:20:10.681002 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba9d4a1-300f-4367-ba2e-528ed4635dfd" containerName="proxy-httpd" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.681081 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba9d4a1-300f-4367-ba2e-528ed4635dfd" containerName="proxy-httpd" Oct 03 14:20:10 crc kubenswrapper[4636]: E1003 14:20:10.681193 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba9d4a1-300f-4367-ba2e-528ed4635dfd" containerName="ceilometer-notification-agent" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.681262 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba9d4a1-300f-4367-ba2e-528ed4635dfd" containerName="ceilometer-notification-agent" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.681525 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba9d4a1-300f-4367-ba2e-528ed4635dfd" containerName="proxy-httpd" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.682700 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba9d4a1-300f-4367-ba2e-528ed4635dfd" containerName="ceilometer-notification-agent" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.682828 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba9d4a1-300f-4367-ba2e-528ed4635dfd" containerName="sg-core" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.682908 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eb04b62-9d0b-4dda-aff1-022bed4af5b4" containerName="barbican-db-sync" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.681720 4636 scope.go:117] "RemoveContainer" containerID="3b75947feb19706c3e79b90e578326314be1b3fa669a5b3e96558305af9b2f2b" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.684782 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba9d4a1-300f-4367-ba2e-528ed4635dfd-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.684911 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7f6878bdf6-2vf98" Oct 03 14:20:10 crc kubenswrapper[4636]: E1003 14:20:10.690674 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b75947feb19706c3e79b90e578326314be1b3fa669a5b3e96558305af9b2f2b\": container with ID starting with 3b75947feb19706c3e79b90e578326314be1b3fa669a5b3e96558305af9b2f2b not found: ID does not exist" containerID="3b75947feb19706c3e79b90e578326314be1b3fa669a5b3e96558305af9b2f2b" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.690721 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b75947feb19706c3e79b90e578326314be1b3fa669a5b3e96558305af9b2f2b"} err="failed to get container status \"3b75947feb19706c3e79b90e578326314be1b3fa669a5b3e96558305af9b2f2b\": rpc error: code = NotFound desc = could not find container \"3b75947feb19706c3e79b90e578326314be1b3fa669a5b3e96558305af9b2f2b\": container with ID starting with 3b75947feb19706c3e79b90e578326314be1b3fa669a5b3e96558305af9b2f2b not found: ID does not exist" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.690752 4636 scope.go:117] "RemoveContainer" containerID="eabf2762dee0eff5ecb5cecd5dfc7ea51b47118e99303917c50598058e9d8c8b" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.694536 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.694786 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 03 14:20:10 crc kubenswrapper[4636]: E1003 14:20:10.695491 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eabf2762dee0eff5ecb5cecd5dfc7ea51b47118e99303917c50598058e9d8c8b\": container with ID starting with eabf2762dee0eff5ecb5cecd5dfc7ea51b47118e99303917c50598058e9d8c8b not found: ID does not exist" containerID="eabf2762dee0eff5ecb5cecd5dfc7ea51b47118e99303917c50598058e9d8c8b" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.695524 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eabf2762dee0eff5ecb5cecd5dfc7ea51b47118e99303917c50598058e9d8c8b"} err="failed to get container status \"eabf2762dee0eff5ecb5cecd5dfc7ea51b47118e99303917c50598058e9d8c8b\": rpc error: code = NotFound desc = could not find container \"eabf2762dee0eff5ecb5cecd5dfc7ea51b47118e99303917c50598058e9d8c8b\": container with ID starting with eabf2762dee0eff5ecb5cecd5dfc7ea51b47118e99303917c50598058e9d8c8b not found: ID does not exist" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.695552 4636 scope.go:117] "RemoveContainer" containerID="6d6682058ef934697b8fa17d5d6c6d2a077c3a6fe5e5007f59272a5198efa3b2" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.695856 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qjlnd" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.696925 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6f76699687-g9k2g"] Oct 03 14:20:10 crc kubenswrapper[4636]: E1003 14:20:10.703432 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d6682058ef934697b8fa17d5d6c6d2a077c3a6fe5e5007f59272a5198efa3b2\": container with ID starting with 
6d6682058ef934697b8fa17d5d6c6d2a077c3a6fe5e5007f59272a5198efa3b2 not found: ID does not exist" containerID="6d6682058ef934697b8fa17d5d6c6d2a077c3a6fe5e5007f59272a5198efa3b2" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.703757 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d6682058ef934697b8fa17d5d6c6d2a077c3a6fe5e5007f59272a5198efa3b2"} err="failed to get container status \"6d6682058ef934697b8fa17d5d6c6d2a077c3a6fe5e5007f59272a5198efa3b2\": rpc error: code = NotFound desc = could not find container \"6d6682058ef934697b8fa17d5d6c6d2a077c3a6fe5e5007f59272a5198efa3b2\": container with ID starting with 6d6682058ef934697b8fa17d5d6c6d2a077c3a6fe5e5007f59272a5198efa3b2 not found: ID does not exist" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.710900 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6f76699687-g9k2g" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.712649 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.737582 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6f76699687-g9k2g"] Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.767204 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7f6878bdf6-2vf98"] Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.786609 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cf408a6-c7e6-4bf3-80c6-47cc10bec465-config-data-custom\") pod \"barbican-keystone-listener-7f6878bdf6-2vf98\" (UID: \"9cf408a6-c7e6-4bf3-80c6-47cc10bec465\") " pod="openstack/barbican-keystone-listener-7f6878bdf6-2vf98" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.786674 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf408a6-c7e6-4bf3-80c6-47cc10bec465-config-data\") pod \"barbican-keystone-listener-7f6878bdf6-2vf98\" (UID: \"9cf408a6-c7e6-4bf3-80c6-47cc10bec465\") " pod="openstack/barbican-keystone-listener-7f6878bdf6-2vf98" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.786737 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf408a6-c7e6-4bf3-80c6-47cc10bec465-combined-ca-bundle\") pod \"barbican-keystone-listener-7f6878bdf6-2vf98\" (UID: \"9cf408a6-c7e6-4bf3-80c6-47cc10bec465\") " pod="openstack/barbican-keystone-listener-7f6878bdf6-2vf98" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.786769 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwhlx\" (UniqueName: \"kubernetes.io/projected/9cf408a6-c7e6-4bf3-80c6-47cc10bec465-kube-api-access-hwhlx\") pod \"barbican-keystone-listener-7f6878bdf6-2vf98\" (UID: \"9cf408a6-c7e6-4bf3-80c6-47cc10bec465\") " pod="openstack/barbican-keystone-listener-7f6878bdf6-2vf98" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.786844 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cf408a6-c7e6-4bf3-80c6-47cc10bec465-logs\") pod \"barbican-keystone-listener-7f6878bdf6-2vf98\" (UID: 
\"9cf408a6-c7e6-4bf3-80c6-47cc10bec465\") " pod="openstack/barbican-keystone-listener-7f6878bdf6-2vf98" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.850153 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-px7xz"] Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.852187 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.894866 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.895830 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/596e3078-e359-4e8d-a7c0-74c710f2c2f9-config-data-custom\") pod \"barbican-worker-6f76699687-g9k2g\" (UID: \"596e3078-e359-4e8d-a7c0-74c710f2c2f9\") " pod="openstack/barbican-worker-6f76699687-g9k2g" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.895989 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-config\") pod \"dnsmasq-dns-586bdc5f9-px7xz\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.896109 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cf408a6-c7e6-4bf3-80c6-47cc10bec465-logs\") pod \"barbican-keystone-listener-7f6878bdf6-2vf98\" (UID: \"9cf408a6-c7e6-4bf3-80c6-47cc10bec465\") " pod="openstack/barbican-keystone-listener-7f6878bdf6-2vf98" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.896202 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-px7xz\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.896319 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-px7xz\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.896424 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/596e3078-e359-4e8d-a7c0-74c710f2c2f9-config-data\") pod \"barbican-worker-6f76699687-g9k2g\" (UID: \"596e3078-e359-4e8d-a7c0-74c710f2c2f9\") " pod="openstack/barbican-worker-6f76699687-g9k2g" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.896516 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-px7xz\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.896614 4636 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cf408a6-c7e6-4bf3-80c6-47cc10bec465-config-data-custom\") pod \"barbican-keystone-listener-7f6878bdf6-2vf98\" (UID: \"9cf408a6-c7e6-4bf3-80c6-47cc10bec465\") " pod="openstack/barbican-keystone-listener-7f6878bdf6-2vf98" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.896695 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596e3078-e359-4e8d-a7c0-74c710f2c2f9-logs\") pod \"barbican-worker-6f76699687-g9k2g\" (UID: \"596e3078-e359-4e8d-a7c0-74c710f2c2f9\") " pod="openstack/barbican-worker-6f76699687-g9k2g" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.896789 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf408a6-c7e6-4bf3-80c6-47cc10bec465-config-data\") pod \"barbican-keystone-listener-7f6878bdf6-2vf98\" (UID: \"9cf408a6-c7e6-4bf3-80c6-47cc10bec465\") " pod="openstack/barbican-keystone-listener-7f6878bdf6-2vf98" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.896885 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mcqq\" (UniqueName: \"kubernetes.io/projected/596e3078-e359-4e8d-a7c0-74c710f2c2f9-kube-api-access-8mcqq\") pod \"barbican-worker-6f76699687-g9k2g\" (UID: \"596e3078-e359-4e8d-a7c0-74c710f2c2f9\") " pod="openstack/barbican-worker-6f76699687-g9k2g" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.896978 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgvnx\" (UniqueName: \"kubernetes.io/projected/e78065e8-9570-4ebb-8bda-4e03a586d97e-kube-api-access-dgvnx\") pod \"dnsmasq-dns-586bdc5f9-px7xz\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.897077 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596e3078-e359-4e8d-a7c0-74c710f2c2f9-combined-ca-bundle\") pod \"barbican-worker-6f76699687-g9k2g\" (UID: \"596e3078-e359-4e8d-a7c0-74c710f2c2f9\") " pod="openstack/barbican-worker-6f76699687-g9k2g" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.897180 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf408a6-c7e6-4bf3-80c6-47cc10bec465-combined-ca-bundle\") pod \"barbican-keystone-listener-7f6878bdf6-2vf98\" (UID: \"9cf408a6-c7e6-4bf3-80c6-47cc10bec465\") " pod="openstack/barbican-keystone-listener-7f6878bdf6-2vf98" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.897280 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwhlx\" (UniqueName: \"kubernetes.io/projected/9cf408a6-c7e6-4bf3-80c6-47cc10bec465-kube-api-access-hwhlx\") pod \"barbican-keystone-listener-7f6878bdf6-2vf98\" (UID: \"9cf408a6-c7e6-4bf3-80c6-47cc10bec465\") " pod="openstack/barbican-keystone-listener-7f6878bdf6-2vf98" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.897390 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-px7xz\" (UID: 
\"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.897814 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cf408a6-c7e6-4bf3-80c6-47cc10bec465-logs\") pod \"barbican-keystone-listener-7f6878bdf6-2vf98\" (UID: \"9cf408a6-c7e6-4bf3-80c6-47cc10bec465\") " pod="openstack/barbican-keystone-listener-7f6878bdf6-2vf98" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.909626 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf408a6-c7e6-4bf3-80c6-47cc10bec465-combined-ca-bundle\") pod \"barbican-keystone-listener-7f6878bdf6-2vf98\" (UID: \"9cf408a6-c7e6-4bf3-80c6-47cc10bec465\") " pod="openstack/barbican-keystone-listener-7f6878bdf6-2vf98" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.915134 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf408a6-c7e6-4bf3-80c6-47cc10bec465-config-data\") pod \"barbican-keystone-listener-7f6878bdf6-2vf98\" (UID: \"9cf408a6-c7e6-4bf3-80c6-47cc10bec465\") " pod="openstack/barbican-keystone-listener-7f6878bdf6-2vf98" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.916369 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cf408a6-c7e6-4bf3-80c6-47cc10bec465-config-data-custom\") pod \"barbican-keystone-listener-7f6878bdf6-2vf98\" (UID: \"9cf408a6-c7e6-4bf3-80c6-47cc10bec465\") " pod="openstack/barbican-keystone-listener-7f6878bdf6-2vf98" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.940082 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.954800 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwhlx\" (UniqueName: \"kubernetes.io/projected/9cf408a6-c7e6-4bf3-80c6-47cc10bec465-kube-api-access-hwhlx\") pod \"barbican-keystone-listener-7f6878bdf6-2vf98\" (UID: \"9cf408a6-c7e6-4bf3-80c6-47cc10bec465\") " pod="openstack/barbican-keystone-listener-7f6878bdf6-2vf98" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.975157 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-px7xz"] Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.997495 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.998997 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-px7xz\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.999042 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/596e3078-e359-4e8d-a7c0-74c710f2c2f9-config-data-custom\") pod \"barbican-worker-6f76699687-g9k2g\" (UID: \"596e3078-e359-4e8d-a7c0-74c710f2c2f9\") " pod="openstack/barbican-worker-6f76699687-g9k2g" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.999065 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-config\") pod \"dnsmasq-dns-586bdc5f9-px7xz\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.999114 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-px7xz\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.999165 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-px7xz\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.999210 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/596e3078-e359-4e8d-a7c0-74c710f2c2f9-config-data\") pod \"barbican-worker-6f76699687-g9k2g\" (UID: \"596e3078-e359-4e8d-a7c0-74c710f2c2f9\") " pod="openstack/barbican-worker-6f76699687-g9k2g" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.999230 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-px7xz\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.999247 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596e3078-e359-4e8d-a7c0-74c710f2c2f9-logs\") pod \"barbican-worker-6f76699687-g9k2g\" (UID: \"596e3078-e359-4e8d-a7c0-74c710f2c2f9\") " pod="openstack/barbican-worker-6f76699687-g9k2g" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.999274 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mcqq\" (UniqueName: \"kubernetes.io/projected/596e3078-e359-4e8d-a7c0-74c710f2c2f9-kube-api-access-8mcqq\") pod \"barbican-worker-6f76699687-g9k2g\" (UID: \"596e3078-e359-4e8d-a7c0-74c710f2c2f9\") " pod="openstack/barbican-worker-6f76699687-g9k2g" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.999294 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgvnx\" (UniqueName: \"kubernetes.io/projected/e78065e8-9570-4ebb-8bda-4e03a586d97e-kube-api-access-dgvnx\") pod \"dnsmasq-dns-586bdc5f9-px7xz\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.999324 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596e3078-e359-4e8d-a7c0-74c710f2c2f9-combined-ca-bundle\") pod \"barbican-worker-6f76699687-g9k2g\" (UID: \"596e3078-e359-4e8d-a7c0-74c710f2c2f9\") " pod="openstack/barbican-worker-6f76699687-g9k2g" Oct 03 14:20:10 crc kubenswrapper[4636]: I1003 14:20:10.999629 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.001297 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-px7xz\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.001824 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-px7xz\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.005616 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596e3078-e359-4e8d-a7c0-74c710f2c2f9-logs\") pod \"barbican-worker-6f76699687-g9k2g\" (UID: \"596e3078-e359-4e8d-a7c0-74c710f2c2f9\") " pod="openstack/barbican-worker-6f76699687-g9k2g" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.006014 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.006264 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.007842 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-config\") pod \"dnsmasq-dns-586bdc5f9-px7xz\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.008772 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-px7xz\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.009302 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-px7xz\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.014898 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596e3078-e359-4e8d-a7c0-74c710f2c2f9-combined-ca-bundle\") pod \"barbican-worker-6f76699687-g9k2g\" (UID: \"596e3078-e359-4e8d-a7c0-74c710f2c2f9\") " pod="openstack/barbican-worker-6f76699687-g9k2g" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.014968 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.017403 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/596e3078-e359-4e8d-a7c0-74c710f2c2f9-config-data-custom\") pod \"barbican-worker-6f76699687-g9k2g\" (UID: \"596e3078-e359-4e8d-a7c0-74c710f2c2f9\") " 
pod="openstack/barbican-worker-6f76699687-g9k2g" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.020011 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/596e3078-e359-4e8d-a7c0-74c710f2c2f9-config-data\") pod \"barbican-worker-6f76699687-g9k2g\" (UID: \"596e3078-e359-4e8d-a7c0-74c710f2c2f9\") " pod="openstack/barbican-worker-6f76699687-g9k2g" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.039066 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgvnx\" (UniqueName: \"kubernetes.io/projected/e78065e8-9570-4ebb-8bda-4e03a586d97e-kube-api-access-dgvnx\") pod \"dnsmasq-dns-586bdc5f9-px7xz\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.051141 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mcqq\" (UniqueName: \"kubernetes.io/projected/596e3078-e359-4e8d-a7c0-74c710f2c2f9-kube-api-access-8mcqq\") pod \"barbican-worker-6f76699687-g9k2g\" (UID: \"596e3078-e359-4e8d-a7c0-74c710f2c2f9\") " pod="openstack/barbican-worker-6f76699687-g9k2g" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.058051 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-57b58d64fd-vdxfd"] Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.058202 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7f6878bdf6-2vf98" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.060058 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6f76699687-g9k2g" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.060083 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-57b58d64fd-vdxfd" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.067364 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57b58d64fd-vdxfd"] Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.076324 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.101503 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.101566 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/615f5c9e-d9ff-4193-9477-478118e04b99-logs\") pod \"barbican-api-57b58d64fd-vdxfd\" (UID: \"615f5c9e-d9ff-4193-9477-478118e04b99\") " pod="openstack/barbican-api-57b58d64fd-vdxfd" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.101603 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.101637 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-scripts\") pod \"ceilometer-0\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.101670 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/615f5c9e-d9ff-4193-9477-478118e04b99-config-data-custom\") pod \"barbican-api-57b58d64fd-vdxfd\" (UID: \"615f5c9e-d9ff-4193-9477-478118e04b99\") " pod="openstack/barbican-api-57b58d64fd-vdxfd" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.101703 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-run-httpd\") pod \"ceilometer-0\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.101926 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-log-httpd\") pod \"ceilometer-0\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.101996 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9s77\" (UniqueName: \"kubernetes.io/projected/615f5c9e-d9ff-4193-9477-478118e04b99-kube-api-access-l9s77\") pod \"barbican-api-57b58d64fd-vdxfd\" (UID: \"615f5c9e-d9ff-4193-9477-478118e04b99\") " pod="openstack/barbican-api-57b58d64fd-vdxfd" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.102053 4636 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-config-data\") pod \"ceilometer-0\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.102230 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9zv5\" (UniqueName: \"kubernetes.io/projected/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-kube-api-access-h9zv5\") pod \"ceilometer-0\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.102300 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615f5c9e-d9ff-4193-9477-478118e04b99-combined-ca-bundle\") pod \"barbican-api-57b58d64fd-vdxfd\" (UID: \"615f5c9e-d9ff-4193-9477-478118e04b99\") " pod="openstack/barbican-api-57b58d64fd-vdxfd" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.102355 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615f5c9e-d9ff-4193-9477-478118e04b99-config-data\") pod \"barbican-api-57b58d64fd-vdxfd\" (UID: \"615f5c9e-d9ff-4193-9477-478118e04b99\") " pod="openstack/barbican-api-57b58d64fd-vdxfd" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.203148 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615f5c9e-d9ff-4193-9477-478118e04b99-combined-ca-bundle\") pod \"barbican-api-57b58d64fd-vdxfd\" (UID: \"615f5c9e-d9ff-4193-9477-478118e04b99\") " pod="openstack/barbican-api-57b58d64fd-vdxfd" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.203360 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615f5c9e-d9ff-4193-9477-478118e04b99-config-data\") pod \"barbican-api-57b58d64fd-vdxfd\" (UID: \"615f5c9e-d9ff-4193-9477-478118e04b99\") " pod="openstack/barbican-api-57b58d64fd-vdxfd" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.203412 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.203429 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/615f5c9e-d9ff-4193-9477-478118e04b99-logs\") pod \"barbican-api-57b58d64fd-vdxfd\" (UID: \"615f5c9e-d9ff-4193-9477-478118e04b99\") " pod="openstack/barbican-api-57b58d64fd-vdxfd" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.203458 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.203479 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-scripts\") pod \"ceilometer-0\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.203497 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/615f5c9e-d9ff-4193-9477-478118e04b99-config-data-custom\") pod \"barbican-api-57b58d64fd-vdxfd\" (UID: \"615f5c9e-d9ff-4193-9477-478118e04b99\") " pod="openstack/barbican-api-57b58d64fd-vdxfd" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.203518 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-run-httpd\") pod \"ceilometer-0\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.203546 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-log-httpd\") pod \"ceilometer-0\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.203579 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9s77\" (UniqueName: \"kubernetes.io/projected/615f5c9e-d9ff-4193-9477-478118e04b99-kube-api-access-l9s77\") pod \"barbican-api-57b58d64fd-vdxfd\" (UID: \"615f5c9e-d9ff-4193-9477-478118e04b99\") " pod="openstack/barbican-api-57b58d64fd-vdxfd" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.203593 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-config-data\") pod \"ceilometer-0\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.203628 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9zv5\" (UniqueName: \"kubernetes.io/projected/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-kube-api-access-h9zv5\") pod \"ceilometer-0\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.205225 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-log-httpd\") pod \"ceilometer-0\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.208834 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/615f5c9e-d9ff-4193-9477-478118e04b99-logs\") pod \"barbican-api-57b58d64fd-vdxfd\" (UID: \"615f5c9e-d9ff-4193-9477-478118e04b99\") " pod="openstack/barbican-api-57b58d64fd-vdxfd" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.208892 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-run-httpd\") pod \"ceilometer-0\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.212145 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/615f5c9e-d9ff-4193-9477-478118e04b99-config-data-custom\") pod \"barbican-api-57b58d64fd-vdxfd\" (UID: \"615f5c9e-d9ff-4193-9477-478118e04b99\") " pod="openstack/barbican-api-57b58d64fd-vdxfd" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.213113 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615f5c9e-d9ff-4193-9477-478118e04b99-config-data\") pod \"barbican-api-57b58d64fd-vdxfd\" (UID: \"615f5c9e-d9ff-4193-9477-478118e04b99\") " pod="openstack/barbican-api-57b58d64fd-vdxfd" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.215797 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.216936 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615f5c9e-d9ff-4193-9477-478118e04b99-combined-ca-bundle\") pod \"barbican-api-57b58d64fd-vdxfd\" (UID: \"615f5c9e-d9ff-4193-9477-478118e04b99\") " pod="openstack/barbican-api-57b58d64fd-vdxfd" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.217256 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.224934 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9zv5\" (UniqueName: \"kubernetes.io/projected/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-kube-api-access-h9zv5\") pod \"ceilometer-0\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.225521 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-scripts\") pod \"ceilometer-0\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.226308 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.231848 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-config-data\") pod \"ceilometer-0\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.245432 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9s77\" (UniqueName: \"kubernetes.io/projected/615f5c9e-d9ff-4193-9477-478118e04b99-kube-api-access-l9s77\") pod \"barbican-api-57b58d64fd-vdxfd\" (UID: \"615f5c9e-d9ff-4193-9477-478118e04b99\") " pod="openstack/barbican-api-57b58d64fd-vdxfd" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.323674 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.384070 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57b58d64fd-vdxfd" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.478971 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6f76699687-g9k2g"] Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.582354 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7f6878bdf6-2vf98"] Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.952493 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-97znq" Oct 03 14:20:11 crc kubenswrapper[4636]: I1003 14:20:11.958783 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-px7xz"] Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.120803 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2a38ef-2fad-4a66-a131-2f690ceb72f1-combined-ca-bundle\") pod \"7d2a38ef-2fad-4a66-a131-2f690ceb72f1\" (UID: \"7d2a38ef-2fad-4a66-a131-2f690ceb72f1\") " Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.121154 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d2a38ef-2fad-4a66-a131-2f690ceb72f1-config\") pod \"7d2a38ef-2fad-4a66-a131-2f690ceb72f1\" (UID: \"7d2a38ef-2fad-4a66-a131-2f690ceb72f1\") " Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.121208 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgnbs\" (UniqueName: \"kubernetes.io/projected/7d2a38ef-2fad-4a66-a131-2f690ceb72f1-kube-api-access-hgnbs\") pod \"7d2a38ef-2fad-4a66-a131-2f690ceb72f1\" (UID: \"7d2a38ef-2fad-4a66-a131-2f690ceb72f1\") " Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.127231 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d2a38ef-2fad-4a66-a131-2f690ceb72f1-kube-api-access-hgnbs" (OuterVolumeSpecName: "kube-api-access-hgnbs") pod "7d2a38ef-2fad-4a66-a131-2f690ceb72f1" (UID: "7d2a38ef-2fad-4a66-a131-2f690ceb72f1"). InnerVolumeSpecName "kube-api-access-hgnbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.151403 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2a38ef-2fad-4a66-a131-2f690ceb72f1-config" (OuterVolumeSpecName: "config") pod "7d2a38ef-2fad-4a66-a131-2f690ceb72f1" (UID: "7d2a38ef-2fad-4a66-a131-2f690ceb72f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.167026 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2a38ef-2fad-4a66-a131-2f690ceb72f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d2a38ef-2fad-4a66-a131-2f690ceb72f1" (UID: "7d2a38ef-2fad-4a66-a131-2f690ceb72f1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.223585 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2a38ef-2fad-4a66-a131-2f690ceb72f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.223617 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d2a38ef-2fad-4a66-a131-2f690ceb72f1-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.223628 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgnbs\" (UniqueName: \"kubernetes.io/projected/7d2a38ef-2fad-4a66-a131-2f690ceb72f1-kube-api-access-hgnbs\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.266224 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57b58d64fd-vdxfd"] Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.274828 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.490405 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-97znq" event={"ID":"7d2a38ef-2fad-4a66-a131-2f690ceb72f1","Type":"ContainerDied","Data":"9e0e7f5e08ba646f03fbb03f2233f2e2ec5fda676b9dcecd70f12618c0182f05"} Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.490463 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e0e7f5e08ba646f03fbb03f2233f2e2ec5fda676b9dcecd70f12618c0182f05" Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.490430 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-97znq" Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.491889 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57b58d64fd-vdxfd" event={"ID":"615f5c9e-d9ff-4193-9477-478118e04b99","Type":"ContainerStarted","Data":"db74f7afbd8c8386b3ff93c4bb1db3057fade10a3aa767b9648c989888aca581"} Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.494469 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f6878bdf6-2vf98" event={"ID":"9cf408a6-c7e6-4bf3-80c6-47cc10bec465","Type":"ContainerStarted","Data":"14cbddb8642f1b376525f3fc2dc4bd3875213b089ab3185f8ef69b94e7f48e88"} Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.496361 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8","Type":"ContainerStarted","Data":"2dbcecde24a5d7471fdb4745e0eb311375f9ea933f7baedd22c8e24617070e90"} Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.498237 4636 generic.go:334] "Generic (PLEG): container finished" podID="e78065e8-9570-4ebb-8bda-4e03a586d97e" containerID="7de68dd85588528f3106d0815601d243439391ab986b40cd7d08944e76448a74" exitCode=0 Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.498301 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" event={"ID":"e78065e8-9570-4ebb-8bda-4e03a586d97e","Type":"ContainerDied","Data":"7de68dd85588528f3106d0815601d243439391ab986b40cd7d08944e76448a74"} Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.498330 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" event={"ID":"e78065e8-9570-4ebb-8bda-4e03a586d97e","Type":"ContainerStarted","Data":"7cf8dd246ca7d66f5c9211972216eb6b783c50c186c7f5f83b1c2cebd87a7c28"} Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.504043 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f76699687-g9k2g" event={"ID":"596e3078-e359-4e8d-a7c0-74c710f2c2f9","Type":"ContainerStarted","Data":"cca8af80ccdee26667c0ce168937b9c7d0686d6b76fc2c114f0c4dc788ed2d98"} Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.810042 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ba9d4a1-300f-4367-ba2e-528ed4635dfd" path="/var/lib/kubelet/pods/6ba9d4a1-300f-4367-ba2e-528ed4635dfd/volumes" Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.833333 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-px7xz"] Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.943308 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-ddslh"] Oct 03 14:20:12 crc kubenswrapper[4636]: E1003 14:20:12.943840 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d2a38ef-2fad-4a66-a131-2f690ceb72f1" containerName="neutron-db-sync" Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.943863 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d2a38ef-2fad-4a66-a131-2f690ceb72f1" containerName="neutron-db-sync" Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.944085 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d2a38ef-2fad-4a66-a131-2f690ceb72f1" containerName="neutron-db-sync" Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.945228 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-ddslh" Oct 03 14:20:12 crc kubenswrapper[4636]: I1003 14:20:12.972469 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-ddslh"] Oct 03 14:20:13 crc kubenswrapper[4636]: E1003 14:20:13.034133 4636 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 03 14:20:13 crc kubenswrapper[4636]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/e78065e8-9570-4ebb-8bda-4e03a586d97e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 03 14:20:13 crc kubenswrapper[4636]: > podSandboxID="7cf8dd246ca7d66f5c9211972216eb6b783c50c186c7f5f83b1c2cebd87a7c28" Oct 03 14:20:13 crc kubenswrapper[4636]: E1003 14:20:13.034537 4636 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 03 14:20:13 crc kubenswrapper[4636]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5cch55h5bchd4h54bh78hbfh579h5hc4h67h8ch674h658h5c4hb4h87h6h95h574h65fhb8h659h9bh56bh66fh5c9h65dh594h5f8h77h79q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dgvnx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-586bdc5f9-px7xz_openstack(e78065e8-9570-4ebb-8bda-4e03a586d97e): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/e78065e8-9570-4ebb-8bda-4e03a586d97e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 03 14:20:13 crc kubenswrapper[4636]: > logger="UnhandledError" Oct 03 14:20:13 crc kubenswrapper[4636]: E1003 14:20:13.036534 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/e78065e8-9570-4ebb-8bda-4e03a586d97e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" podUID="e78065e8-9570-4ebb-8bda-4e03a586d97e" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.044468 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-ddslh\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") " pod="openstack/dnsmasq-dns-85ff748b95-ddslh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.044655 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-ddslh\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") " pod="openstack/dnsmasq-dns-85ff748b95-ddslh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.044717 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-dns-svc\") pod \"dnsmasq-dns-85ff748b95-ddslh\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") " pod="openstack/dnsmasq-dns-85ff748b95-ddslh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.044759 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-ddslh\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") " pod="openstack/dnsmasq-dns-85ff748b95-ddslh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.044783 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-config\") pod 
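
Note: the CreateContainerError above is the one real failure in this burst. dnsmasq-dns-586bdc5f9-px7xz had just been deleted (SyncLoop DELETE at 14:20:12.833333) while the runtime was still re-creating its dnsmasq-dns container, so the subPath source the kubelet had staged under /var/lib/kubelet/pods/e78065e8-.../volume-subpaths/dns-svc/dnsmasq-dns/1 was already torn down when CRI-O tried to bind-mount it into the rootfs-relative target etc/dnsmasq.d/hosts/dns-svc. A minimal sketch of the mount in question, rebuilt in Go API types from the &Container{...} dump (only the names that appear in the dump; the enclosing PodSpec is assumed):

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// The failing mount from the &Container{...} dump above. A non-empty
	// SubPath makes the kubelet stage a per-container bind mount under
	// /var/lib/kubelet/pods/<pod-uid>/volume-subpaths/<volume>/<container>/<mount-index>,
	// which is the source path CRI-O reported as missing.
	m := corev1.VolumeMount{
		Name:      "dns-svc",
		ReadOnly:  true,
		MountPath: "/etc/dnsmasq.d/hosts/dns-svc",
		SubPath:   "dns-svc",
	}
	fmt.Printf("%s -> %s (subPath %q)\n", m.Name, m.MountPath, m.SubPath)
}

The error clears on its own: the replacement pod dnsmasq-dns-85ff748b95-ddslh was already ADDed at 14:20:12.943308, and pod_workers simply skips further syncs of the dying pod.
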
\"dnsmasq-dns-85ff748b95-ddslh\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") " pod="openstack/dnsmasq-dns-85ff748b95-ddslh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.044823 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkzgx\" (UniqueName: \"kubernetes.io/projected/da5e13d2-6aca-4810-bb7e-882f73b4aa33-kube-api-access-hkzgx\") pod \"dnsmasq-dns-85ff748b95-ddslh\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") " pod="openstack/dnsmasq-dns-85ff748b95-ddslh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.147520 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-dns-svc\") pod \"dnsmasq-dns-85ff748b95-ddslh\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") " pod="openstack/dnsmasq-dns-85ff748b95-ddslh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.147587 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-ddslh\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") " pod="openstack/dnsmasq-dns-85ff748b95-ddslh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.147612 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-config\") pod \"dnsmasq-dns-85ff748b95-ddslh\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") " pod="openstack/dnsmasq-dns-85ff748b95-ddslh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.147654 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkzgx\" (UniqueName: \"kubernetes.io/projected/da5e13d2-6aca-4810-bb7e-882f73b4aa33-kube-api-access-hkzgx\") pod \"dnsmasq-dns-85ff748b95-ddslh\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") " pod="openstack/dnsmasq-dns-85ff748b95-ddslh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.147772 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-ddslh\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") " pod="openstack/dnsmasq-dns-85ff748b95-ddslh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.147835 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-ddslh\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") " pod="openstack/dnsmasq-dns-85ff748b95-ddslh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.148730 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-dns-svc\") pod \"dnsmasq-dns-85ff748b95-ddslh\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") " pod="openstack/dnsmasq-dns-85ff748b95-ddslh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.157900 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-ddslh\" (UID: 
\"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") " pod="openstack/dnsmasq-dns-85ff748b95-ddslh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.158616 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-config\") pod \"dnsmasq-dns-85ff748b95-ddslh\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") " pod="openstack/dnsmasq-dns-85ff748b95-ddslh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.164919 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-ddslh\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") " pod="openstack/dnsmasq-dns-85ff748b95-ddslh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.165682 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-ddslh\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") " pod="openstack/dnsmasq-dns-85ff748b95-ddslh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.183883 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkzgx\" (UniqueName: \"kubernetes.io/projected/da5e13d2-6aca-4810-bb7e-882f73b4aa33-kube-api-access-hkzgx\") pod \"dnsmasq-dns-85ff748b95-ddslh\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") " pod="openstack/dnsmasq-dns-85ff748b95-ddslh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.261083 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7585984468-sqdbh"] Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.262910 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7585984468-sqdbh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.267607 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.267820 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bbhhb" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.269871 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.273311 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.276833 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-ddslh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.292522 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7585984468-sqdbh"] Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.352663 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-ovndb-tls-certs\") pod \"neutron-7585984468-sqdbh\" (UID: \"3616dd77-ea16-43c1-9d40-592fb7226c95\") " pod="openstack/neutron-7585984468-sqdbh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.352760 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhs26\" (UniqueName: \"kubernetes.io/projected/3616dd77-ea16-43c1-9d40-592fb7226c95-kube-api-access-qhs26\") pod \"neutron-7585984468-sqdbh\" (UID: \"3616dd77-ea16-43c1-9d40-592fb7226c95\") " pod="openstack/neutron-7585984468-sqdbh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.352809 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-combined-ca-bundle\") pod \"neutron-7585984468-sqdbh\" (UID: \"3616dd77-ea16-43c1-9d40-592fb7226c95\") " pod="openstack/neutron-7585984468-sqdbh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.352881 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-httpd-config\") pod \"neutron-7585984468-sqdbh\" (UID: \"3616dd77-ea16-43c1-9d40-592fb7226c95\") " pod="openstack/neutron-7585984468-sqdbh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.352921 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-config\") pod \"neutron-7585984468-sqdbh\" (UID: \"3616dd77-ea16-43c1-9d40-592fb7226c95\") " pod="openstack/neutron-7585984468-sqdbh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.469333 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-config\") pod \"neutron-7585984468-sqdbh\" (UID: \"3616dd77-ea16-43c1-9d40-592fb7226c95\") " pod="openstack/neutron-7585984468-sqdbh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.469684 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-ovndb-tls-certs\") pod \"neutron-7585984468-sqdbh\" (UID: \"3616dd77-ea16-43c1-9d40-592fb7226c95\") " pod="openstack/neutron-7585984468-sqdbh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.469817 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhs26\" (UniqueName: \"kubernetes.io/projected/3616dd77-ea16-43c1-9d40-592fb7226c95-kube-api-access-qhs26\") pod \"neutron-7585984468-sqdbh\" (UID: \"3616dd77-ea16-43c1-9d40-592fb7226c95\") " pod="openstack/neutron-7585984468-sqdbh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.469892 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-combined-ca-bundle\") pod \"neutron-7585984468-sqdbh\" (UID: \"3616dd77-ea16-43c1-9d40-592fb7226c95\") " pod="openstack/neutron-7585984468-sqdbh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.470033 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-httpd-config\") pod \"neutron-7585984468-sqdbh\" (UID: \"3616dd77-ea16-43c1-9d40-592fb7226c95\") " pod="openstack/neutron-7585984468-sqdbh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.485277 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-ovndb-tls-certs\") pod \"neutron-7585984468-sqdbh\" (UID: \"3616dd77-ea16-43c1-9d40-592fb7226c95\") " pod="openstack/neutron-7585984468-sqdbh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.498764 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-config\") pod \"neutron-7585984468-sqdbh\" (UID: \"3616dd77-ea16-43c1-9d40-592fb7226c95\") " pod="openstack/neutron-7585984468-sqdbh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.499678 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-combined-ca-bundle\") pod \"neutron-7585984468-sqdbh\" (UID: \"3616dd77-ea16-43c1-9d40-592fb7226c95\") " pod="openstack/neutron-7585984468-sqdbh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.507844 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-httpd-config\") pod \"neutron-7585984468-sqdbh\" (UID: \"3616dd77-ea16-43c1-9d40-592fb7226c95\") " pod="openstack/neutron-7585984468-sqdbh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.524391 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhs26\" (UniqueName: \"kubernetes.io/projected/3616dd77-ea16-43c1-9d40-592fb7226c95-kube-api-access-qhs26\") pod \"neutron-7585984468-sqdbh\" (UID: \"3616dd77-ea16-43c1-9d40-592fb7226c95\") " pod="openstack/neutron-7585984468-sqdbh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.557011 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57b58d64fd-vdxfd" event={"ID":"615f5c9e-d9ff-4193-9477-478118e04b99","Type":"ContainerStarted","Data":"8b4b27ac630a3a086d9e7199fad3641a18345ffcc77aa1c88400047e35d13f61"} Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.589660 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7585984468-sqdbh" Oct 03 14:20:13 crc kubenswrapper[4636]: I1003 14:20:13.865610 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-ddslh"] Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.055302 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.110711 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-dns-svc\") pod \"e78065e8-9570-4ebb-8bda-4e03a586d97e\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.110838 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-ovsdbserver-nb\") pod \"e78065e8-9570-4ebb-8bda-4e03a586d97e\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.110865 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-ovsdbserver-sb\") pod \"e78065e8-9570-4ebb-8bda-4e03a586d97e\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.110994 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgvnx\" (UniqueName: \"kubernetes.io/projected/e78065e8-9570-4ebb-8bda-4e03a586d97e-kube-api-access-dgvnx\") pod \"e78065e8-9570-4ebb-8bda-4e03a586d97e\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.111068 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-dns-swift-storage-0\") pod \"e78065e8-9570-4ebb-8bda-4e03a586d97e\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.118923 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e78065e8-9570-4ebb-8bda-4e03a586d97e-kube-api-access-dgvnx" (OuterVolumeSpecName: "kube-api-access-dgvnx") pod "e78065e8-9570-4ebb-8bda-4e03a586d97e" (UID: "e78065e8-9570-4ebb-8bda-4e03a586d97e"). InnerVolumeSpecName "kube-api-access-dgvnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.126333 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-config\") pod \"e78065e8-9570-4ebb-8bda-4e03a586d97e\" (UID: \"e78065e8-9570-4ebb-8bda-4e03a586d97e\") " Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.127562 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgvnx\" (UniqueName: \"kubernetes.io/projected/e78065e8-9570-4ebb-8bda-4e03a586d97e-kube-api-access-dgvnx\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.374581 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e78065e8-9570-4ebb-8bda-4e03a586d97e" (UID: "e78065e8-9570-4ebb-8bda-4e03a586d97e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.401300 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e78065e8-9570-4ebb-8bda-4e03a586d97e" (UID: "e78065e8-9570-4ebb-8bda-4e03a586d97e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.405836 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e78065e8-9570-4ebb-8bda-4e03a586d97e" (UID: "e78065e8-9570-4ebb-8bda-4e03a586d97e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.413960 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e78065e8-9570-4ebb-8bda-4e03a586d97e" (UID: "e78065e8-9570-4ebb-8bda-4e03a586d97e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.444383 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-config" (OuterVolumeSpecName: "config") pod "e78065e8-9570-4ebb-8bda-4e03a586d97e" (UID: "e78065e8-9570-4ebb-8bda-4e03a586d97e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.446926 4636 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.447016 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.447071 4636 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.447153 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.447216 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e78065e8-9570-4ebb-8bda-4e03a586d97e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.456016 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7585984468-sqdbh"] Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.609346 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" 
event={"ID":"e78065e8-9570-4ebb-8bda-4e03a586d97e","Type":"ContainerDied","Data":"7cf8dd246ca7d66f5c9211972216eb6b783c50c186c7f5f83b1c2cebd87a7c28"} Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.609407 4636 scope.go:117] "RemoveContainer" containerID="7de68dd85588528f3106d0815601d243439391ab986b40cd7d08944e76448a74" Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.609552 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-px7xz" Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.624302 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7585984468-sqdbh" event={"ID":"3616dd77-ea16-43c1-9d40-592fb7226c95","Type":"ContainerStarted","Data":"e6967a73fd3f9ba55aa216866f27a89213432b96fe5d45f80679547bf9003346"} Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.635257 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57b58d64fd-vdxfd" event={"ID":"615f5c9e-d9ff-4193-9477-478118e04b99","Type":"ContainerStarted","Data":"9053ed3170c2e9b80831b009d920e7b532be6831d884146f5ded24dee31bce3f"} Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.636274 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57b58d64fd-vdxfd" Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.636336 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57b58d64fd-vdxfd" Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.655800 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-ddslh" event={"ID":"da5e13d2-6aca-4810-bb7e-882f73b4aa33","Type":"ContainerStarted","Data":"383ff5430a786830efcfd2291ee62a10eeead8f9b01686dcdf2c5545c35e522a"} Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.656006 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-ddslh" event={"ID":"da5e13d2-6aca-4810-bb7e-882f73b4aa33","Type":"ContainerStarted","Data":"493ec061031c8d47a56117dbbac1b15b23bc25551a8360b677f6322e0501e2f8"} Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.703346 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8","Type":"ContainerStarted","Data":"3a88bc976bacd430f39b404dc8dfadf6b2c862db4a5ba83918668bb3ee243d84"} Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.718656 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-px7xz"] Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.747869 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-px7xz"] Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.748111 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-57b58d64fd-vdxfd" podStartSLOduration=4.748069482 podStartE2EDuration="4.748069482s" podCreationTimestamp="2025-10-03 14:20:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:20:14.729337375 +0000 UTC m=+1164.588063622" watchObservedRunningTime="2025-10-03 14:20:14.748069482 +0000 UTC m=+1164.606795719" Oct 03 14:20:14 crc kubenswrapper[4636]: I1003 14:20:14.847871 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e78065e8-9570-4ebb-8bda-4e03a586d97e" 
path="/var/lib/kubelet/pods/e78065e8-9570-4ebb-8bda-4e03a586d97e/volumes" Oct 03 14:20:15 crc kubenswrapper[4636]: I1003 14:20:15.716660 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7585984468-sqdbh" event={"ID":"3616dd77-ea16-43c1-9d40-592fb7226c95","Type":"ContainerStarted","Data":"07eaec71e05a6ff3da632c7ad94b5d9a035a32b09a6c3fa3c43ff26e3e26f127"} Oct 03 14:20:15 crc kubenswrapper[4636]: I1003 14:20:15.718438 4636 generic.go:334] "Generic (PLEG): container finished" podID="da5e13d2-6aca-4810-bb7e-882f73b4aa33" containerID="383ff5430a786830efcfd2291ee62a10eeead8f9b01686dcdf2c5545c35e522a" exitCode=0 Oct 03 14:20:15 crc kubenswrapper[4636]: I1003 14:20:15.718500 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-ddslh" event={"ID":"da5e13d2-6aca-4810-bb7e-882f73b4aa33","Type":"ContainerDied","Data":"383ff5430a786830efcfd2291ee62a10eeead8f9b01686dcdf2c5545c35e522a"} Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.408574 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-d7d56d58f-cswwm"] Oct 03 14:20:16 crc kubenswrapper[4636]: E1003 14:20:16.410854 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78065e8-9570-4ebb-8bda-4e03a586d97e" containerName="init" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.411006 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="e78065e8-9570-4ebb-8bda-4e03a586d97e" containerName="init" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.418181 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="e78065e8-9570-4ebb-8bda-4e03a586d97e" containerName="init" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.419497 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.422333 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.422537 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.472258 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d7d56d58f-cswwm"] Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.485563 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fac2cb-4974-4241-8a11-77ad13d22306-internal-tls-certs\") pod \"neutron-d7d56d58f-cswwm\" (UID: \"58fac2cb-4974-4241-8a11-77ad13d22306\") " pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.485688 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fac2cb-4974-4241-8a11-77ad13d22306-ovndb-tls-certs\") pod \"neutron-d7d56d58f-cswwm\" (UID: \"58fac2cb-4974-4241-8a11-77ad13d22306\") " pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.485719 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fac2cb-4974-4241-8a11-77ad13d22306-public-tls-certs\") pod \"neutron-d7d56d58f-cswwm\" (UID: \"58fac2cb-4974-4241-8a11-77ad13d22306\") " pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.485799 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58fac2cb-4974-4241-8a11-77ad13d22306-combined-ca-bundle\") pod \"neutron-d7d56d58f-cswwm\" (UID: \"58fac2cb-4974-4241-8a11-77ad13d22306\") " pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.485835 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/58fac2cb-4974-4241-8a11-77ad13d22306-httpd-config\") pod \"neutron-d7d56d58f-cswwm\" (UID: \"58fac2cb-4974-4241-8a11-77ad13d22306\") " pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.485867 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/58fac2cb-4974-4241-8a11-77ad13d22306-config\") pod \"neutron-d7d56d58f-cswwm\" (UID: \"58fac2cb-4974-4241-8a11-77ad13d22306\") " pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.485890 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-689st\" (UniqueName: \"kubernetes.io/projected/58fac2cb-4974-4241-8a11-77ad13d22306-kube-api-access-689st\") pod \"neutron-d7d56d58f-cswwm\" (UID: \"58fac2cb-4974-4241-8a11-77ad13d22306\") " pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.590891 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/58fac2cb-4974-4241-8a11-77ad13d22306-ovndb-tls-certs\") pod \"neutron-d7d56d58f-cswwm\" (UID: \"58fac2cb-4974-4241-8a11-77ad13d22306\") " pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.591213 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fac2cb-4974-4241-8a11-77ad13d22306-public-tls-certs\") pod \"neutron-d7d56d58f-cswwm\" (UID: \"58fac2cb-4974-4241-8a11-77ad13d22306\") " pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.591246 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58fac2cb-4974-4241-8a11-77ad13d22306-combined-ca-bundle\") pod \"neutron-d7d56d58f-cswwm\" (UID: \"58fac2cb-4974-4241-8a11-77ad13d22306\") " pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.591289 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/58fac2cb-4974-4241-8a11-77ad13d22306-httpd-config\") pod \"neutron-d7d56d58f-cswwm\" (UID: \"58fac2cb-4974-4241-8a11-77ad13d22306\") " pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.591325 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/58fac2cb-4974-4241-8a11-77ad13d22306-config\") pod \"neutron-d7d56d58f-cswwm\" (UID: \"58fac2cb-4974-4241-8a11-77ad13d22306\") " pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.591346 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-689st\" (UniqueName: \"kubernetes.io/projected/58fac2cb-4974-4241-8a11-77ad13d22306-kube-api-access-689st\") pod \"neutron-d7d56d58f-cswwm\" (UID: \"58fac2cb-4974-4241-8a11-77ad13d22306\") " pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.591386 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fac2cb-4974-4241-8a11-77ad13d22306-internal-tls-certs\") pod \"neutron-d7d56d58f-cswwm\" (UID: \"58fac2cb-4974-4241-8a11-77ad13d22306\") " pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.599891 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fac2cb-4974-4241-8a11-77ad13d22306-internal-tls-certs\") pod \"neutron-d7d56d58f-cswwm\" (UID: \"58fac2cb-4974-4241-8a11-77ad13d22306\") " pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.609118 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/58fac2cb-4974-4241-8a11-77ad13d22306-httpd-config\") pod \"neutron-d7d56d58f-cswwm\" (UID: \"58fac2cb-4974-4241-8a11-77ad13d22306\") " pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.609277 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/58fac2cb-4974-4241-8a11-77ad13d22306-config\") pod \"neutron-d7d56d58f-cswwm\" (UID: \"58fac2cb-4974-4241-8a11-77ad13d22306\") " 
pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.616204 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fac2cb-4974-4241-8a11-77ad13d22306-ovndb-tls-certs\") pod \"neutron-d7d56d58f-cswwm\" (UID: \"58fac2cb-4974-4241-8a11-77ad13d22306\") " pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.627671 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58fac2cb-4974-4241-8a11-77ad13d22306-combined-ca-bundle\") pod \"neutron-d7d56d58f-cswwm\" (UID: \"58fac2cb-4974-4241-8a11-77ad13d22306\") " pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.636962 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fac2cb-4974-4241-8a11-77ad13d22306-public-tls-certs\") pod \"neutron-d7d56d58f-cswwm\" (UID: \"58fac2cb-4974-4241-8a11-77ad13d22306\") " pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.676861 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-689st\" (UniqueName: \"kubernetes.io/projected/58fac2cb-4974-4241-8a11-77ad13d22306-kube-api-access-689st\") pod \"neutron-d7d56d58f-cswwm\" (UID: \"58fac2cb-4974-4241-8a11-77ad13d22306\") " pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.751990 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.761163 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f6878bdf6-2vf98" event={"ID":"9cf408a6-c7e6-4bf3-80c6-47cc10bec465","Type":"ContainerStarted","Data":"31753fc38562d3b31dcf074ecdca62543ae7094e3815633d062d93fb7ac2e553"} Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.764048 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-ddslh" event={"ID":"da5e13d2-6aca-4810-bb7e-882f73b4aa33","Type":"ContainerStarted","Data":"9cfa53e10033707a515a398a0393b9cc91c5a92d819782aaf54fdf1aaf583540"} Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.765115 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-ddslh" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.778377 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8","Type":"ContainerStarted","Data":"2093127f95e374b6967df7e40de3db189d17633811478466ab45e5c4767dc7cf"} Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.794605 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-ddslh" podStartSLOduration=4.794586174 podStartE2EDuration="4.794586174s" podCreationTimestamp="2025-10-03 14:20:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:20:16.793665241 +0000 UTC m=+1166.652391498" watchObservedRunningTime="2025-10-03 14:20:16.794586174 +0000 UTC m=+1166.653312421" Oct 03 14:20:16 crc kubenswrapper[4636]: I1003 14:20:16.877800 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-7585984468-sqdbh" podStartSLOduration=3.877776276 podStartE2EDuration="3.877776276s" podCreationTimestamp="2025-10-03 14:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:20:16.862670053 +0000 UTC m=+1166.721396300" watchObservedRunningTime="2025-10-03 14:20:16.877776276 +0000 UTC m=+1166.736502523" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.112994 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7585984468-sqdbh" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.113197 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f76699687-g9k2g" event={"ID":"596e3078-e359-4e8d-a7c0-74c710f2c2f9","Type":"ContainerStarted","Data":"7f7ecbcc313fffbb48e0f19782cc494a70ad46ccc40626ff5df2b964b7811fae"} Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.113219 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7585984468-sqdbh" event={"ID":"3616dd77-ea16-43c1-9d40-592fb7226c95","Type":"ContainerStarted","Data":"2af1a8964148e3fedde957f30efd5c8a8020efad1dc7d9c14a07ecf8d0362e28"} Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.601216 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d7d56d58f-cswwm"] Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.626363 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-f548d674d-2q8gg"] Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.627954 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.633020 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.638040 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.645327 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-f548d674d-2q8gg"] Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.655972 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5faec7-3829-49b6-aca7-452f5eae6a67-combined-ca-bundle\") pod \"barbican-api-f548d674d-2q8gg\" (UID: \"ee5faec7-3829-49b6-aca7-452f5eae6a67\") " pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.656044 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee5faec7-3829-49b6-aca7-452f5eae6a67-config-data-custom\") pod \"barbican-api-f548d674d-2q8gg\" (UID: \"ee5faec7-3829-49b6-aca7-452f5eae6a67\") " pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.656093 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee5faec7-3829-49b6-aca7-452f5eae6a67-public-tls-certs\") pod \"barbican-api-f548d674d-2q8gg\" (UID: \"ee5faec7-3829-49b6-aca7-452f5eae6a67\") " pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.656145 4636 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnmqd\" (UniqueName: \"kubernetes.io/projected/ee5faec7-3829-49b6-aca7-452f5eae6a67-kube-api-access-vnmqd\") pod \"barbican-api-f548d674d-2q8gg\" (UID: \"ee5faec7-3829-49b6-aca7-452f5eae6a67\") " pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.656207 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee5faec7-3829-49b6-aca7-452f5eae6a67-logs\") pod \"barbican-api-f548d674d-2q8gg\" (UID: \"ee5faec7-3829-49b6-aca7-452f5eae6a67\") " pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.656244 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee5faec7-3829-49b6-aca7-452f5eae6a67-internal-tls-certs\") pod \"barbican-api-f548d674d-2q8gg\" (UID: \"ee5faec7-3829-49b6-aca7-452f5eae6a67\") " pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.656280 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee5faec7-3829-49b6-aca7-452f5eae6a67-config-data\") pod \"barbican-api-f548d674d-2q8gg\" (UID: \"ee5faec7-3829-49b6-aca7-452f5eae6a67\") " pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.692711 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7976d47688-kx5v5" podUID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.692764 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.693313 4636 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"6f203755d3b7d2412b112d16f7778187f0cdb274206e2b3e4aaeccb274cae768"} pod="openstack/horizon-7976d47688-kx5v5" containerMessage="Container horizon failed startup probe, will be restarted" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.693344 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7976d47688-kx5v5" podUID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerName="horizon" containerID="cri-o://6f203755d3b7d2412b112d16f7778187f0cdb274206e2b3e4aaeccb274cae768" gracePeriod=30 Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.757614 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee5faec7-3829-49b6-aca7-452f5eae6a67-config-data-custom\") pod \"barbican-api-f548d674d-2q8gg\" (UID: \"ee5faec7-3829-49b6-aca7-452f5eae6a67\") " pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.758765 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee5faec7-3829-49b6-aca7-452f5eae6a67-public-tls-certs\") pod \"barbican-api-f548d674d-2q8gg\" (UID: 
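
Note: the four horizon lines above are a complete startup-probe failure cycle: the prober gets connection refused on https://10.217.0.149:8443, the sync loop flags the probe unhealthy, and the runtime manager kills the container with a 30s grace period so it can be restarted. Reconstructed from the logged URL, the probe would look roughly like this (scheme, port and path come from the log; the period and threshold are assumptions, not visible in the journal):

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
	// Startup probe implied by the failure output above.
	probe := corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Scheme: corev1.URISchemeHTTPS,
				Port:   intstr.FromInt(8443),
				Path:   "/dashboard/auth/login/?next=/dashboard/",
			},
		},
		PeriodSeconds:    10, // assumed
		FailureThreshold: 6,  // assumed
	}
	fmt.Printf("GET %s :%s %s\n", probe.HTTPGet.Scheme, probe.HTTPGet.Port.String(), probe.HTTPGet.Path)
}

Until a startup probe succeeds, liveness and readiness probes are held off; a failure past the threshold restarts the container rather than merely marking it unready, which is exactly the "will be restarted" path logged here.
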
\"ee5faec7-3829-49b6-aca7-452f5eae6a67\") " pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.758873 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnmqd\" (UniqueName: \"kubernetes.io/projected/ee5faec7-3829-49b6-aca7-452f5eae6a67-kube-api-access-vnmqd\") pod \"barbican-api-f548d674d-2q8gg\" (UID: \"ee5faec7-3829-49b6-aca7-452f5eae6a67\") " pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.759034 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee5faec7-3829-49b6-aca7-452f5eae6a67-logs\") pod \"barbican-api-f548d674d-2q8gg\" (UID: \"ee5faec7-3829-49b6-aca7-452f5eae6a67\") " pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.759191 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee5faec7-3829-49b6-aca7-452f5eae6a67-internal-tls-certs\") pod \"barbican-api-f548d674d-2q8gg\" (UID: \"ee5faec7-3829-49b6-aca7-452f5eae6a67\") " pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.759271 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee5faec7-3829-49b6-aca7-452f5eae6a67-config-data\") pod \"barbican-api-f548d674d-2q8gg\" (UID: \"ee5faec7-3829-49b6-aca7-452f5eae6a67\") " pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.759380 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5faec7-3829-49b6-aca7-452f5eae6a67-combined-ca-bundle\") pod \"barbican-api-f548d674d-2q8gg\" (UID: \"ee5faec7-3829-49b6-aca7-452f5eae6a67\") " pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.762218 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee5faec7-3829-49b6-aca7-452f5eae6a67-logs\") pod \"barbican-api-f548d674d-2q8gg\" (UID: \"ee5faec7-3829-49b6-aca7-452f5eae6a67\") " pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.773478 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee5faec7-3829-49b6-aca7-452f5eae6a67-internal-tls-certs\") pod \"barbican-api-f548d674d-2q8gg\" (UID: \"ee5faec7-3829-49b6-aca7-452f5eae6a67\") " pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.774113 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5faec7-3829-49b6-aca7-452f5eae6a67-combined-ca-bundle\") pod \"barbican-api-f548d674d-2q8gg\" (UID: \"ee5faec7-3829-49b6-aca7-452f5eae6a67\") " pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.775763 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee5faec7-3829-49b6-aca7-452f5eae6a67-public-tls-certs\") pod \"barbican-api-f548d674d-2q8gg\" (UID: \"ee5faec7-3829-49b6-aca7-452f5eae6a67\") " pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 
14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.776796 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee5faec7-3829-49b6-aca7-452f5eae6a67-config-data\") pod \"barbican-api-f548d674d-2q8gg\" (UID: \"ee5faec7-3829-49b6-aca7-452f5eae6a67\") " pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.785065 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee5faec7-3829-49b6-aca7-452f5eae6a67-config-data-custom\") pod \"barbican-api-f548d674d-2q8gg\" (UID: \"ee5faec7-3829-49b6-aca7-452f5eae6a67\") " pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.796964 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnmqd\" (UniqueName: \"kubernetes.io/projected/ee5faec7-3829-49b6-aca7-452f5eae6a67-kube-api-access-vnmqd\") pod \"barbican-api-f548d674d-2q8gg\" (UID: \"ee5faec7-3829-49b6-aca7-452f5eae6a67\") " pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.859791 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f76699687-g9k2g" event={"ID":"596e3078-e359-4e8d-a7c0-74c710f2c2f9","Type":"ContainerStarted","Data":"91ceed804c245db398c88b53ccefd6dce932cf4c123c7fe22f5a0ac5360971a6"} Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.871350 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f6878bdf6-2vf98" event={"ID":"9cf408a6-c7e6-4bf3-80c6-47cc10bec465","Type":"ContainerStarted","Data":"db7ba155d3c35247f0dea23d357030e310c7bec411affcfb2c73f1d59cd3f1f9"} Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.880244 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d7d56d58f-cswwm" event={"ID":"58fac2cb-4974-4241-8a11-77ad13d22306","Type":"ContainerStarted","Data":"1253b2c592a17ceb73100c1d7c1c6ad76688cab056610609a2adac393db13736"} Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.907015 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6f76699687-g9k2g" podStartSLOduration=3.470549966 podStartE2EDuration="7.906993652s" podCreationTimestamp="2025-10-03 14:20:10 +0000 UTC" firstStartedPulling="2025-10-03 14:20:11.632875278 +0000 UTC m=+1161.491601525" lastFinishedPulling="2025-10-03 14:20:16.069318974 +0000 UTC m=+1165.928045211" observedRunningTime="2025-10-03 14:20:17.888786499 +0000 UTC m=+1167.747512746" watchObservedRunningTime="2025-10-03 14:20:17.906993652 +0000 UTC m=+1167.765719899" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.918610 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7f6878bdf6-2vf98" podStartSLOduration=3.55193592 podStartE2EDuration="7.918585953s" podCreationTimestamp="2025-10-03 14:20:10 +0000 UTC" firstStartedPulling="2025-10-03 14:20:11.705246108 +0000 UTC m=+1161.563972355" lastFinishedPulling="2025-10-03 14:20:16.071896141 +0000 UTC m=+1165.930622388" observedRunningTime="2025-10-03 14:20:17.914939328 +0000 UTC m=+1167.773665575" watchObservedRunningTime="2025-10-03 14:20:17.918585953 +0000 UTC m=+1167.777312200" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.973188 4636 util.go:30] "No sandbox for pod can be found. 
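
Note: the pod_startup_latency_tracker lines decompose as follows: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that figure minus the image-pull window (lastFinishedPulling - firstStartedPulling); when no pull happened (the zero 0001-01-01 timestamps seen earlier for barbican-api-57b58d64fd-vdxfd and dnsmasq-dns-85ff748b95-ddslh) the two durations coincide. Checking against barbican-worker-6f76699687-g9k2g above, using the monotonic m= offsets for the pull window:

    E2E  = 14:20:17.906993652 - 14:20:10            = 7.906993652 s
    pull = m=+1165.928045211 - m=+1161.491601525    = 4.436443686 s
    SLO  = 7.906993652 - 4.436443686                = 3.470549966 s

which matches the logged podStartSLOduration=3.470549966 exactly.
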
Need to start a new one" pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.973938 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-8c5bc9456-rfvns" podUID="0025da7c-17f3-4036-a9fc-3330508c11cd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.974086 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.982029 4636 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"79ec2b52d512bd5fdecc8099ed615849a5e48db694fa62ea57cc7a7daeb17a1c"} pod="openstack/horizon-8c5bc9456-rfvns" containerMessage="Container horizon failed startup probe, will be restarted" Oct 03 14:20:17 crc kubenswrapper[4636]: I1003 14:20:17.982085 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8c5bc9456-rfvns" podUID="0025da7c-17f3-4036-a9fc-3330508c11cd" containerName="horizon" containerID="cri-o://79ec2b52d512bd5fdecc8099ed615849a5e48db694fa62ea57cc7a7daeb17a1c" gracePeriod=30 Oct 03 14:20:18 crc kubenswrapper[4636]: I1003 14:20:18.319488 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-f548d674d-2q8gg"] Oct 03 14:20:18 crc kubenswrapper[4636]: W1003 14:20:18.346531 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee5faec7_3829_49b6_aca7_452f5eae6a67.slice/crio-d3aa2d16e9cfcc0e5a404e2027e81534e5d3d562a975ed43bb087badafc11692 WatchSource:0}: Error finding container d3aa2d16e9cfcc0e5a404e2027e81534e5d3d562a975ed43bb087badafc11692: Status 404 returned error can't find the container with id d3aa2d16e9cfcc0e5a404e2027e81534e5d3d562a975ed43bb087badafc11692 Oct 03 14:20:18 crc kubenswrapper[4636]: I1003 14:20:18.891164 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8","Type":"ContainerStarted","Data":"65cb10ffdac695f678230c7ebcb1b6f1734bebb98e527bc2d15b9cd26d0a1acb"} Oct 03 14:20:18 crc kubenswrapper[4636]: I1003 14:20:18.896056 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f548d674d-2q8gg" event={"ID":"ee5faec7-3829-49b6-aca7-452f5eae6a67","Type":"ContainerStarted","Data":"e26f555040c347486788a452d9ebee7fbebd04255a1cea1c9c87ef6f19e48c3c"} Oct 03 14:20:18 crc kubenswrapper[4636]: I1003 14:20:18.896185 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f548d674d-2q8gg" event={"ID":"ee5faec7-3829-49b6-aca7-452f5eae6a67","Type":"ContainerStarted","Data":"56b4ec4758871cc1a8cefd41d18d60e0d44906323838fc6b44ed40a63b7ef396"} Oct 03 14:20:18 crc kubenswrapper[4636]: I1003 14:20:18.896207 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f548d674d-2q8gg" event={"ID":"ee5faec7-3829-49b6-aca7-452f5eae6a67","Type":"ContainerStarted","Data":"d3aa2d16e9cfcc0e5a404e2027e81534e5d3d562a975ed43bb087badafc11692"} Oct 03 14:20:18 crc kubenswrapper[4636]: I1003 14:20:18.896253 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-f548d674d-2q8gg" Oct 03 14:20:18 crc kubenswrapper[4636]: I1003 14:20:18.896271 
Oct 03 14:20:18 crc kubenswrapper[4636]: I1003 14:20:18.925871 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d7d56d58f-cswwm" event={"ID":"58fac2cb-4974-4241-8a11-77ad13d22306","Type":"ContainerStarted","Data":"4aeff17370e19a997d42bcfe3036fc12f0b49e0c2f6d7cc39ae8ce805964a7bd"} Oct 03 14:20:18 crc kubenswrapper[4636]: I1003 14:20:18.925910 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d7d56d58f-cswwm" event={"ID":"58fac2cb-4974-4241-8a11-77ad13d22306","Type":"ContainerStarted","Data":"bc310e4b84aaf2ed9d62e6fa547b287253f077f7e7a52911d5a23b3d00ea3649"} Oct 03 14:20:18 crc kubenswrapper[4636]: I1003 14:20:18.925923 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:18 crc kubenswrapper[4636]: I1003 14:20:18.934559 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-f548d674d-2q8gg" podStartSLOduration=1.934542935 podStartE2EDuration="1.934542935s" podCreationTimestamp="2025-10-03 14:20:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:20:18.931542927 +0000 UTC m=+1168.790269174" watchObservedRunningTime="2025-10-03 14:20:18.934542935 +0000 UTC m=+1168.793269182" Oct 03 14:20:18 crc kubenswrapper[4636]: I1003 14:20:18.959555 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-d7d56d58f-cswwm" podStartSLOduration=2.9595408450000003 podStartE2EDuration="2.959540845s" podCreationTimestamp="2025-10-03 14:20:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:20:18.959215426 +0000 UTC m=+1168.817941673" watchObservedRunningTime="2025-10-03 14:20:18.959540845 +0000 UTC m=+1168.818267092" Oct 03 14:20:19 crc kubenswrapper[4636]: I1003 14:20:19.939675 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8","Type":"ContainerStarted","Data":"a6ad748b87c9a3535ceb9f586f48083f2d1bd233d5773e1951e9e31ad55886c1"} Oct 03 14:20:19 crc kubenswrapper[4636]: I1003 14:20:19.940300 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 14:20:19 crc kubenswrapper[4636]: I1003 14:20:19.963005 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.824040381 podStartE2EDuration="9.962982881s" podCreationTimestamp="2025-10-03 14:20:10 +0000 UTC" firstStartedPulling="2025-10-03 14:20:12.284860874 +0000 UTC m=+1162.143587121" lastFinishedPulling="2025-10-03 14:20:19.423803374 +0000 UTC m=+1169.282529621" observedRunningTime="2025-10-03 14:20:19.958610697 +0000 UTC m=+1169.817336944" watchObservedRunningTime="2025-10-03 14:20:19.962982881 +0000 UTC m=+1169.821709128" Oct 03 14:20:20 crc kubenswrapper[4636]: I1003 14:20:20.950014 4636 generic.go:334] "Generic (PLEG): container finished" podID="48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49" containerID="18a48e0b0583ccc1d96ce521f6e96af8c5680842f0f276483d011fab120d4498" exitCode=0 Oct 03 14:20:20 crc kubenswrapper[4636]: I1003 14:20:20.950059 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zgg4h" 
event={"ID":"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49","Type":"ContainerDied","Data":"18a48e0b0583ccc1d96ce521f6e96af8c5680842f0f276483d011fab120d4498"} Oct 03 14:20:22 crc kubenswrapper[4636]: I1003 14:20:22.402834 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-zgg4h" Oct 03 14:20:22 crc kubenswrapper[4636]: I1003 14:20:22.540748 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-etc-machine-id\") pod \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " Oct 03 14:20:22 crc kubenswrapper[4636]: I1003 14:20:22.540821 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96vlm\" (UniqueName: \"kubernetes.io/projected/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-kube-api-access-96vlm\") pod \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " Oct 03 14:20:22 crc kubenswrapper[4636]: I1003 14:20:22.540869 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-config-data\") pod \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " Oct 03 14:20:22 crc kubenswrapper[4636]: I1003 14:20:22.540971 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-db-sync-config-data\") pod \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " Oct 03 14:20:22 crc kubenswrapper[4636]: I1003 14:20:22.541030 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-combined-ca-bundle\") pod \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " Oct 03 14:20:22 crc kubenswrapper[4636]: I1003 14:20:22.541135 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-scripts\") pod \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\" (UID: \"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49\") " Oct 03 14:20:22 crc kubenswrapper[4636]: I1003 14:20:22.541329 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49" (UID: "48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 14:20:22 crc kubenswrapper[4636]: I1003 14:20:22.544610 4636 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:22 crc kubenswrapper[4636]: I1003 14:20:22.553539 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-scripts" (OuterVolumeSpecName: "scripts") pod "48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49" (UID: "48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:22 crc kubenswrapper[4636]: I1003 14:20:22.578738 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-kube-api-access-96vlm" (OuterVolumeSpecName: "kube-api-access-96vlm") pod "48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49" (UID: "48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49"). InnerVolumeSpecName "kube-api-access-96vlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:20:22 crc kubenswrapper[4636]: I1003 14:20:22.585238 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49" (UID: "48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:22 crc kubenswrapper[4636]: I1003 14:20:22.629610 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49" (UID: "48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:22 crc kubenswrapper[4636]: I1003 14:20:22.657921 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96vlm\" (UniqueName: \"kubernetes.io/projected/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-kube-api-access-96vlm\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:22 crc kubenswrapper[4636]: I1003 14:20:22.657961 4636 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:22 crc kubenswrapper[4636]: I1003 14:20:22.657977 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:22 crc kubenswrapper[4636]: I1003 14:20:22.657991 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:22 crc kubenswrapper[4636]: I1003 14:20:22.706236 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-config-data" (OuterVolumeSpecName: "config-data") pod "48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49" (UID: "48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:22 crc kubenswrapper[4636]: I1003 14:20:22.759666 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:22 crc kubenswrapper[4636]: I1003 14:20:22.980649 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zgg4h" event={"ID":"48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49","Type":"ContainerDied","Data":"73ac79edc3ecc3c312381d76c522466fc40b245e8886c7a1c65811884a02913f"} Oct 03 14:20:22 crc kubenswrapper[4636]: I1003 14:20:22.980688 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73ac79edc3ecc3c312381d76c522466fc40b245e8886c7a1c65811884a02913f" Oct 03 14:20:22 crc kubenswrapper[4636]: I1003 14:20:22.980766 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-zgg4h" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.288530 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-ddslh" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.293828 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 14:20:23 crc kubenswrapper[4636]: E1003 14:20:23.321016 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49" containerName="cinder-db-sync" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.321051 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49" containerName="cinder-db-sync" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.321456 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49" containerName="cinder-db-sync" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.323055 4636 util.go:30] "No sandbox for pod can be found. 
Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.324196 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.327987 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-kxvg7" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.330349 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.335872 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.335970 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.476888 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.476959 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.477021 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzgg7\" (UniqueName: \"kubernetes.io/projected/129b89f7-da91-4bcd-8b05-a1d7f669f513-kube-api-access-lzgg7\") pod \"cinder-scheduler-0\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.477049 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-scripts\") pod \"cinder-scheduler-0\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.477212 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/129b89f7-da91-4bcd-8b05-a1d7f669f513-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.477247 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-config-data\") pod \"cinder-scheduler-0\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.482497 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-plqdl"] Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.482726 4636 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl" podUID="68364772-0202-4eb2-9789-c71d42592c48" containerName="dnsmasq-dns" containerID="cri-o://d495228831ff9b98b3dc64192e48a132b4753e094f0484c2f0acaca216cecf9a" gracePeriod=10 Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.556288 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-n62p4"] Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.557632 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.579244 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-config-data\") pod \"cinder-scheduler-0\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.579538 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.579667 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.579799 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzgg7\" (UniqueName: \"kubernetes.io/projected/129b89f7-da91-4bcd-8b05-a1d7f669f513-kube-api-access-lzgg7\") pod \"cinder-scheduler-0\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.579923 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-scripts\") pod \"cinder-scheduler-0\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.580059 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/129b89f7-da91-4bcd-8b05-a1d7f669f513-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.582246 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-n62p4"] Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.582310 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/129b89f7-da91-4bcd-8b05-a1d7f669f513-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.589155 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.613536 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-config-data\") pod \"cinder-scheduler-0\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.621062 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-scripts\") pod \"cinder-scheduler-0\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.635781 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.653924 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzgg7\" (UniqueName: \"kubernetes.io/projected/129b89f7-da91-4bcd-8b05-a1d7f669f513-kube-api-access-lzgg7\") pod \"cinder-scheduler-0\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.655533 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.685081 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-config\") pod \"dnsmasq-dns-5c9776ccc5-n62p4\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.685184 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-n62p4\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.685223 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-n62p4\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.685256 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsvzc\" (UniqueName: \"kubernetes.io/projected/04a66e39-5c5f-4473-a1af-376694a4f2cf-kube-api-access-lsvzc\") pod \"dnsmasq-dns-5c9776ccc5-n62p4\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.685276 4636 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-n62p4\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.685299 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-n62p4\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.782200 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.783599 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.787415 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-config\") pod \"dnsmasq-dns-5c9776ccc5-n62p4\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.787498 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-n62p4\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.787661 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-n62p4\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.787796 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsvzc\" (UniqueName: \"kubernetes.io/projected/04a66e39-5c5f-4473-a1af-376694a4f2cf-kube-api-access-lsvzc\") pod \"dnsmasq-dns-5c9776ccc5-n62p4\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.787867 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-n62p4\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.787936 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-n62p4\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.788477 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 03 14:20:23 crc 
kubenswrapper[4636]: I1003 14:20:23.788880 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-n62p4\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.789127 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-n62p4\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.789491 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-config\") pod \"dnsmasq-dns-5c9776ccc5-n62p4\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.789682 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-n62p4\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.789844 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-n62p4\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.807146 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.836317 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsvzc\" (UniqueName: \"kubernetes.io/projected/04a66e39-5c5f-4473-a1af-376694a4f2cf-kube-api-access-lsvzc\") pod \"dnsmasq-dns-5c9776ccc5-n62p4\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.879515 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.889224 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7efd4395-025e-4e34-9b79-d585b6eb32c7-logs\") pod \"cinder-api-0\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") " pod="openstack/cinder-api-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.889308 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7efd4395-025e-4e34-9b79-d585b6eb32c7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") " pod="openstack/cinder-api-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.889358 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") " pod="openstack/cinder-api-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.889419 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-config-data\") pod \"cinder-api-0\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") " pod="openstack/cinder-api-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.889501 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-scripts\") pod \"cinder-api-0\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") " pod="openstack/cinder-api-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.889551 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-config-data-custom\") pod \"cinder-api-0\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") " pod="openstack/cinder-api-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.889639 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp5mm\" (UniqueName: \"kubernetes.io/projected/7efd4395-025e-4e34-9b79-d585b6eb32c7-kube-api-access-cp5mm\") pod \"cinder-api-0\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") " pod="openstack/cinder-api-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.992248 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") " pod="openstack/cinder-api-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.992574 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-config-data\") pod \"cinder-api-0\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") " pod="openstack/cinder-api-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.992626 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-scripts\") pod \"cinder-api-0\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") " pod="openstack/cinder-api-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.992655 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-config-data-custom\") pod \"cinder-api-0\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") " pod="openstack/cinder-api-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.992710 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp5mm\" (UniqueName: \"kubernetes.io/projected/7efd4395-025e-4e34-9b79-d585b6eb32c7-kube-api-access-cp5mm\") pod \"cinder-api-0\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") " pod="openstack/cinder-api-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.992753 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7efd4395-025e-4e34-9b79-d585b6eb32c7-logs\") pod \"cinder-api-0\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") " pod="openstack/cinder-api-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.992779 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7efd4395-025e-4e34-9b79-d585b6eb32c7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") " pod="openstack/cinder-api-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.992858 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7efd4395-025e-4e34-9b79-d585b6eb32c7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") " pod="openstack/cinder-api-0" Oct 03 14:20:23 crc kubenswrapper[4636]: I1003 14:20:23.999458 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7efd4395-025e-4e34-9b79-d585b6eb32c7-logs\") pod \"cinder-api-0\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") " pod="openstack/cinder-api-0" Oct 03 14:20:24 crc kubenswrapper[4636]: I1003 14:20:24.004030 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-config-data-custom\") pod \"cinder-api-0\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") " pod="openstack/cinder-api-0" Oct 03 14:20:24 crc kubenswrapper[4636]: I1003 14:20:24.004461 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-scripts\") pod \"cinder-api-0\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") " pod="openstack/cinder-api-0" Oct 03 14:20:24 crc kubenswrapper[4636]: I1003 14:20:24.004523 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") " pod="openstack/cinder-api-0" Oct 03 14:20:24 crc kubenswrapper[4636]: I1003 14:20:24.014329 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-config-data\") pod \"cinder-api-0\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") " pod="openstack/cinder-api-0" Oct 03 14:20:24 crc kubenswrapper[4636]: I1003 14:20:24.026282 4636 generic.go:334] "Generic (PLEG): container finished" podID="68364772-0202-4eb2-9789-c71d42592c48" containerID="d495228831ff9b98b3dc64192e48a132b4753e094f0484c2f0acaca216cecf9a" exitCode=0 Oct 03 14:20:24 crc kubenswrapper[4636]: I1003 14:20:24.026336 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl" event={"ID":"68364772-0202-4eb2-9789-c71d42592c48","Type":"ContainerDied","Data":"d495228831ff9b98b3dc64192e48a132b4753e094f0484c2f0acaca216cecf9a"} Oct 03 14:20:24 crc kubenswrapper[4636]: I1003 14:20:24.028923 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp5mm\" (UniqueName: \"kubernetes.io/projected/7efd4395-025e-4e34-9b79-d585b6eb32c7-kube-api-access-cp5mm\") pod \"cinder-api-0\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") " pod="openstack/cinder-api-0" Oct 03 14:20:24 crc kubenswrapper[4636]: I1003 14:20:24.128600 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 14:20:24 crc kubenswrapper[4636]: I1003 14:20:24.532408 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 14:20:24 crc kubenswrapper[4636]: W1003 14:20:24.555209 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod129b89f7_da91_4bcd_8b05_a1d7f669f513.slice/crio-e5d775ad76db3368246eebbf2802dd2d43ebd1d1126f567f48da036e23f0a8b7 WatchSource:0}: Error finding container e5d775ad76db3368246eebbf2802dd2d43ebd1d1126f567f48da036e23f0a8b7: Status 404 returned error can't find the container with id e5d775ad76db3368246eebbf2802dd2d43ebd1d1126f567f48da036e23f0a8b7 Oct 03 14:20:24 crc kubenswrapper[4636]: I1003 14:20:24.680322 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl" Oct 03 14:20:24 crc kubenswrapper[4636]: I1003 14:20:24.829692 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-dns-svc\") pod \"68364772-0202-4eb2-9789-c71d42592c48\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " Oct 03 14:20:24 crc kubenswrapper[4636]: I1003 14:20:24.829761 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-dns-swift-storage-0\") pod \"68364772-0202-4eb2-9789-c71d42592c48\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " Oct 03 14:20:24 crc kubenswrapper[4636]: I1003 14:20:24.829821 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-ovsdbserver-nb\") pod \"68364772-0202-4eb2-9789-c71d42592c48\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " Oct 03 14:20:24 crc kubenswrapper[4636]: I1003 14:20:24.829840 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-config\") pod \"68364772-0202-4eb2-9789-c71d42592c48\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " Oct 03 14:20:24 crc kubenswrapper[4636]: I1003 14:20:24.829864 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-ovsdbserver-sb\") pod \"68364772-0202-4eb2-9789-c71d42592c48\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " Oct 03 14:20:24 crc kubenswrapper[4636]: I1003 14:20:24.829908 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29hxx\" (UniqueName: \"kubernetes.io/projected/68364772-0202-4eb2-9789-c71d42592c48-kube-api-access-29hxx\") pod \"68364772-0202-4eb2-9789-c71d42592c48\" (UID: \"68364772-0202-4eb2-9789-c71d42592c48\") " Oct 03 14:20:24 crc kubenswrapper[4636]: I1003 14:20:24.845214 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68364772-0202-4eb2-9789-c71d42592c48-kube-api-access-29hxx" (OuterVolumeSpecName: "kube-api-access-29hxx") pod "68364772-0202-4eb2-9789-c71d42592c48" (UID: "68364772-0202-4eb2-9789-c71d42592c48"). InnerVolumeSpecName "kube-api-access-29hxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:20:24 crc kubenswrapper[4636]: I1003 14:20:24.869837 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-n62p4"] Oct 03 14:20:24 crc kubenswrapper[4636]: I1003 14:20:24.933913 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29hxx\" (UniqueName: \"kubernetes.io/projected/68364772-0202-4eb2-9789-c71d42592c48-kube-api-access-29hxx\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:24 crc kubenswrapper[4636]: I1003 14:20:24.980953 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "68364772-0202-4eb2-9789-c71d42592c48" (UID: "68364772-0202-4eb2-9789-c71d42592c48"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:20:25 crc kubenswrapper[4636]: I1003 14:20:25.000655 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "68364772-0202-4eb2-9789-c71d42592c48" (UID: "68364772-0202-4eb2-9789-c71d42592c48"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:20:25 crc kubenswrapper[4636]: I1003 14:20:25.031642 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-config" (OuterVolumeSpecName: "config") pod "68364772-0202-4eb2-9789-c71d42592c48" (UID: "68364772-0202-4eb2-9789-c71d42592c48"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:20:25 crc kubenswrapper[4636]: I1003 14:20:25.043130 4636 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:25 crc kubenswrapper[4636]: I1003 14:20:25.043155 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:25 crc kubenswrapper[4636]: I1003 14:20:25.043166 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:25 crc kubenswrapper[4636]: I1003 14:20:25.059451 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl" event={"ID":"68364772-0202-4eb2-9789-c71d42592c48","Type":"ContainerDied","Data":"25e794b20be9e591c985f28b577c8801090035fd6a79a98a809b03e566b08e50"} Oct 03 14:20:25 crc kubenswrapper[4636]: I1003 14:20:25.059500 4636 scope.go:117] "RemoveContainer" containerID="d495228831ff9b98b3dc64192e48a132b4753e094f0484c2f0acaca216cecf9a" Oct 03 14:20:25 crc kubenswrapper[4636]: I1003 14:20:25.060024 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-plqdl" Oct 03 14:20:25 crc kubenswrapper[4636]: I1003 14:20:25.066954 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" event={"ID":"04a66e39-5c5f-4473-a1af-376694a4f2cf","Type":"ContainerStarted","Data":"b48a169572087d6331f71f50f50a58b40f74ac613086f0855400aea5de0ba07f"} Oct 03 14:20:25 crc kubenswrapper[4636]: I1003 14:20:25.077317 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"129b89f7-da91-4bcd-8b05-a1d7f669f513","Type":"ContainerStarted","Data":"e5d775ad76db3368246eebbf2802dd2d43ebd1d1126f567f48da036e23f0a8b7"} Oct 03 14:20:25 crc kubenswrapper[4636]: I1003 14:20:25.107801 4636 scope.go:117] "RemoveContainer" containerID="73c23f8270bd49ca9b9bb25c9757f812b2da434f1c211827c2b1a7100eed943e" Oct 03 14:20:25 crc kubenswrapper[4636]: I1003 14:20:25.108531 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "68364772-0202-4eb2-9789-c71d42592c48" (UID: "68364772-0202-4eb2-9789-c71d42592c48"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:20:25 crc kubenswrapper[4636]: I1003 14:20:25.109030 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "68364772-0202-4eb2-9789-c71d42592c48" (UID: "68364772-0202-4eb2-9789-c71d42592c48"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:20:25 crc kubenswrapper[4636]: I1003 14:20:25.160260 4636 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:25 crc kubenswrapper[4636]: I1003 14:20:25.160299 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68364772-0202-4eb2-9789-c71d42592c48-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:25 crc kubenswrapper[4636]: I1003 14:20:25.242334 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 14:20:25 crc kubenswrapper[4636]: I1003 14:20:25.533423 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-plqdl"] Oct 03 14:20:25 crc kubenswrapper[4636]: I1003 14:20:25.557634 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-plqdl"] Oct 03 14:20:26 crc kubenswrapper[4636]: I1003 14:20:26.121151 4636 generic.go:334] "Generic (PLEG): container finished" podID="04a66e39-5c5f-4473-a1af-376694a4f2cf" containerID="42d59f20d9d3112092c15c388830898cb7bc8adfeeaa81215f6ab200b08a5d63" exitCode=0 Oct 03 14:20:26 crc kubenswrapper[4636]: I1003 14:20:26.121224 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" event={"ID":"04a66e39-5c5f-4473-a1af-376694a4f2cf","Type":"ContainerDied","Data":"42d59f20d9d3112092c15c388830898cb7bc8adfeeaa81215f6ab200b08a5d63"} Oct 03 14:20:26 crc kubenswrapper[4636]: I1003 14:20:26.152237 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7efd4395-025e-4e34-9b79-d585b6eb32c7","Type":"ContainerStarted","Data":"c08a7931603b8b2643f88abd061a8475377cee576c5e19b35895b0504e8daf89"} Oct 03 14:20:26 crc kubenswrapper[4636]: I1003 14:20:26.477310 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-57b58d64fd-vdxfd" podUID="615f5c9e-d9ff-4193-9477-478118e04b99" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 14:20:26 crc kubenswrapper[4636]: I1003 14:20:26.477672 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-57b58d64fd-vdxfd" podUID="615f5c9e-d9ff-4193-9477-478118e04b99" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 14:20:26 crc kubenswrapper[4636]: I1003 14:20:26.817976 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68364772-0202-4eb2-9789-c71d42592c48" path="/var/lib/kubelet/pods/68364772-0202-4eb2-9789-c71d42592c48/volumes" Oct 03 14:20:27 crc kubenswrapper[4636]: I1003 14:20:27.232980 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" event={"ID":"04a66e39-5c5f-4473-a1af-376694a4f2cf","Type":"ContainerStarted","Data":"98c2497c6ccb79309342f85a5bf129a0c835576a224d0085cb2a6afde4a680a4"} Oct 03 14:20:27 crc kubenswrapper[4636]: I1003 14:20:27.234377 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:20:27 crc kubenswrapper[4636]: I1003 14:20:27.281148 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" podStartSLOduration=4.281132936 podStartE2EDuration="4.281132936s" podCreationTimestamp="2025-10-03 14:20:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:20:27.279764801 +0000 UTC m=+1177.138491058" watchObservedRunningTime="2025-10-03 14:20:27.281132936 +0000 UTC m=+1177.139859183" Oct 03 14:20:27 crc kubenswrapper[4636]: I1003 14:20:27.286620 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"129b89f7-da91-4bcd-8b05-a1d7f669f513","Type":"ContainerStarted","Data":"e1ce1984937464ee82d59b78359f630bdd453add7c0efc223471dc473f961010"} Oct 03 14:20:27 crc kubenswrapper[4636]: I1003 14:20:27.298631 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7efd4395-025e-4e34-9b79-d585b6eb32c7","Type":"ContainerStarted","Data":"628cdcbfdf292d6bbca455a1f421b5817da55424e2d3382d5a7104989c37dcf5"} Oct 03 14:20:28 crc kubenswrapper[4636]: I1003 14:20:28.310682 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7efd4395-025e-4e34-9b79-d585b6eb32c7","Type":"ContainerStarted","Data":"b6d0e670eb2287da59090942eea846417a41f9a9977650ca9f0d1edcd628757a"} Oct 03 14:20:28 crc kubenswrapper[4636]: I1003 14:20:28.340169 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 14:20:28 crc kubenswrapper[4636]: I1003 14:20:28.370116 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.370085195 podStartE2EDuration="5.370085195s" podCreationTimestamp="2025-10-03 14:20:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:20:28.366860841 +0000 UTC m=+1178.225587098" watchObservedRunningTime="2025-10-03 14:20:28.370085195 +0000 UTC m=+1178.228811442" Oct 03 14:20:29 crc kubenswrapper[4636]: I1003 14:20:29.006635 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-f548d674d-2q8gg" podUID="ee5faec7-3829-49b6-aca7-452f5eae6a67" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 14:20:29 crc kubenswrapper[4636]: I1003 14:20:29.006833 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-f548d674d-2q8gg" podUID="ee5faec7-3829-49b6-aca7-452f5eae6a67" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 14:20:29 crc kubenswrapper[4636]: I1003 14:20:29.129433 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 03 14:20:29 crc kubenswrapper[4636]: I1003 
14:20:29.322244 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"129b89f7-da91-4bcd-8b05-a1d7f669f513","Type":"ContainerStarted","Data":"f776a34259df175fcbbff60217d0d220f0ef4e8d246e3ae1a5e18c96ac75d3dc"}
Oct 03 14:20:29 crc kubenswrapper[4636]: I1003 14:20:29.346026 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.957935998 podStartE2EDuration="6.346004737s" podCreationTimestamp="2025-10-03 14:20:23 +0000 UTC" firstStartedPulling="2025-10-03 14:20:24.57353234 +0000 UTC m=+1174.432258587" lastFinishedPulling="2025-10-03 14:20:25.961601079 +0000 UTC m=+1175.820327326" observedRunningTime="2025-10-03 14:20:29.339282322 +0000 UTC m=+1179.198008569" watchObservedRunningTime="2025-10-03 14:20:29.346004737 +0000 UTC m=+1179.204730974"
Oct 03 14:20:29 crc kubenswrapper[4636]: I1003 14:20:29.873439 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-57b58d64fd-vdxfd" podUID="615f5c9e-d9ff-4193-9477-478118e04b99" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 03 14:20:29 crc kubenswrapper[4636]: I1003 14:20:29.995337 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-57b58d64fd-vdxfd" podUID="615f5c9e-d9ff-4193-9477-478118e04b99" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 03 14:20:30 crc kubenswrapper[4636]: I1003 14:20:30.331638 4636 generic.go:334] "Generic (PLEG): container finished" podID="7efd4395-025e-4e34-9b79-d585b6eb32c7" containerID="b6d0e670eb2287da59090942eea846417a41f9a9977650ca9f0d1edcd628757a" exitCode=1
Oct 03 14:20:30 crc kubenswrapper[4636]: I1003 14:20:30.331704 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7efd4395-025e-4e34-9b79-d585b6eb32c7","Type":"ContainerDied","Data":"b6d0e670eb2287da59090942eea846417a41f9a9977650ca9f0d1edcd628757a"}
Oct 03 14:20:30 crc kubenswrapper[4636]: I1003 14:20:30.331838 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7efd4395-025e-4e34-9b79-d585b6eb32c7" containerName="cinder-api-log" containerID="cri-o://628cdcbfdf292d6bbca455a1f421b5817da55424e2d3382d5a7104989c37dcf5" gracePeriod=30
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.225467 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.351629 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-scripts\") pod \"7efd4395-025e-4e34-9b79-d585b6eb32c7\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") "
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.351674 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7efd4395-025e-4e34-9b79-d585b6eb32c7-etc-machine-id\") pod \"7efd4395-025e-4e34-9b79-d585b6eb32c7\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") "
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.351783 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-combined-ca-bundle\") pod \"7efd4395-025e-4e34-9b79-d585b6eb32c7\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") "
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.351859 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-config-data\") pod \"7efd4395-025e-4e34-9b79-d585b6eb32c7\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") "
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.351892 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-config-data-custom\") pod \"7efd4395-025e-4e34-9b79-d585b6eb32c7\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") "
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.351936 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7efd4395-025e-4e34-9b79-d585b6eb32c7-logs\") pod \"7efd4395-025e-4e34-9b79-d585b6eb32c7\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") "
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.351990 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp5mm\" (UniqueName: \"kubernetes.io/projected/7efd4395-025e-4e34-9b79-d585b6eb32c7-kube-api-access-cp5mm\") pod \"7efd4395-025e-4e34-9b79-d585b6eb32c7\" (UID: \"7efd4395-025e-4e34-9b79-d585b6eb32c7\") "
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.352784 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7efd4395-025e-4e34-9b79-d585b6eb32c7-logs" (OuterVolumeSpecName: "logs") pod "7efd4395-025e-4e34-9b79-d585b6eb32c7" (UID: "7efd4395-025e-4e34-9b79-d585b6eb32c7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.352962 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7efd4395-025e-4e34-9b79-d585b6eb32c7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7efd4395-025e-4e34-9b79-d585b6eb32c7" (UID: "7efd4395-025e-4e34-9b79-d585b6eb32c7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.358050 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7efd4395-025e-4e34-9b79-d585b6eb32c7-kube-api-access-cp5mm" (OuterVolumeSpecName: "kube-api-access-cp5mm") pod "7efd4395-025e-4e34-9b79-d585b6eb32c7" (UID: "7efd4395-025e-4e34-9b79-d585b6eb32c7"). InnerVolumeSpecName "kube-api-access-cp5mm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.389189 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7efd4395-025e-4e34-9b79-d585b6eb32c7" (UID: "7efd4395-025e-4e34-9b79-d585b6eb32c7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.392226 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-scripts" (OuterVolumeSpecName: "scripts") pod "7efd4395-025e-4e34-9b79-d585b6eb32c7" (UID: "7efd4395-025e-4e34-9b79-d585b6eb32c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.409202 4636 generic.go:334] "Generic (PLEG): container finished" podID="7efd4395-025e-4e34-9b79-d585b6eb32c7" containerID="628cdcbfdf292d6bbca455a1f421b5817da55424e2d3382d5a7104989c37dcf5" exitCode=143
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.409240 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7efd4395-025e-4e34-9b79-d585b6eb32c7","Type":"ContainerDied","Data":"628cdcbfdf292d6bbca455a1f421b5817da55424e2d3382d5a7104989c37dcf5"}
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.409266 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7efd4395-025e-4e34-9b79-d585b6eb32c7","Type":"ContainerDied","Data":"c08a7931603b8b2643f88abd061a8475377cee576c5e19b35895b0504e8daf89"}
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.409293 4636 scope.go:117] "RemoveContainer" containerID="b6d0e670eb2287da59090942eea846417a41f9a9977650ca9f0d1edcd628757a"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.409486 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.420953 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7efd4395-025e-4e34-9b79-d585b6eb32c7" (UID: "7efd4395-025e-4e34-9b79-d585b6eb32c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.453587 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.453618 4636 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.453628 4636 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7efd4395-025e-4e34-9b79-d585b6eb32c7-logs\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.453638 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp5mm\" (UniqueName: \"kubernetes.io/projected/7efd4395-025e-4e34-9b79-d585b6eb32c7-kube-api-access-cp5mm\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.453650 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-scripts\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.453659 4636 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7efd4395-025e-4e34-9b79-d585b6eb32c7-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.519363 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-57b58d64fd-vdxfd" podUID="615f5c9e-d9ff-4193-9477-478118e04b99" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.555346 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-config-data" (OuterVolumeSpecName: "config-data") pod "7efd4395-025e-4e34-9b79-d585b6eb32c7" (UID: "7efd4395-025e-4e34-9b79-d585b6eb32c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.557200 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efd4395-025e-4e34-9b79-d585b6eb32c7-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.561285 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-57b58d64fd-vdxfd" podUID="615f5c9e-d9ff-4193-9477-478118e04b99" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.562355 4636 scope.go:117] "RemoveContainer" containerID="628cdcbfdf292d6bbca455a1f421b5817da55424e2d3382d5a7104989c37dcf5"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.589849 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57b58d64fd-vdxfd"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.608764 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57b58d64fd-vdxfd"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.652243 4636 scope.go:117] "RemoveContainer" containerID="b6d0e670eb2287da59090942eea846417a41f9a9977650ca9f0d1edcd628757a"
Oct 03 14:20:31 crc kubenswrapper[4636]: E1003 14:20:31.652794 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6d0e670eb2287da59090942eea846417a41f9a9977650ca9f0d1edcd628757a\": container with ID starting with b6d0e670eb2287da59090942eea846417a41f9a9977650ca9f0d1edcd628757a not found: ID does not exist" containerID="b6d0e670eb2287da59090942eea846417a41f9a9977650ca9f0d1edcd628757a"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.652828 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6d0e670eb2287da59090942eea846417a41f9a9977650ca9f0d1edcd628757a"} err="failed to get container status \"b6d0e670eb2287da59090942eea846417a41f9a9977650ca9f0d1edcd628757a\": rpc error: code = NotFound desc = could not find container \"b6d0e670eb2287da59090942eea846417a41f9a9977650ca9f0d1edcd628757a\": container with ID starting with b6d0e670eb2287da59090942eea846417a41f9a9977650ca9f0d1edcd628757a not found: ID does not exist"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.652851 4636 scope.go:117] "RemoveContainer" containerID="628cdcbfdf292d6bbca455a1f421b5817da55424e2d3382d5a7104989c37dcf5"
Oct 03 14:20:31 crc kubenswrapper[4636]: E1003 14:20:31.653213 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"628cdcbfdf292d6bbca455a1f421b5817da55424e2d3382d5a7104989c37dcf5\": container with ID starting with 628cdcbfdf292d6bbca455a1f421b5817da55424e2d3382d5a7104989c37dcf5 not found: ID does not exist" containerID="628cdcbfdf292d6bbca455a1f421b5817da55424e2d3382d5a7104989c37dcf5"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.653235 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"628cdcbfdf292d6bbca455a1f421b5817da55424e2d3382d5a7104989c37dcf5"} err="failed to get container status \"628cdcbfdf292d6bbca455a1f421b5817da55424e2d3382d5a7104989c37dcf5\": rpc error: code = NotFound desc = could not find container \"628cdcbfdf292d6bbca455a1f421b5817da55424e2d3382d5a7104989c37dcf5\": container with ID starting with 628cdcbfdf292d6bbca455a1f421b5817da55424e2d3382d5a7104989c37dcf5 not found: ID does not exist"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.755054 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.761810 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.797054 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Oct 03 14:20:31 crc kubenswrapper[4636]: E1003 14:20:31.797936 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7efd4395-025e-4e34-9b79-d585b6eb32c7" containerName="cinder-api"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.797949 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="7efd4395-025e-4e34-9b79-d585b6eb32c7" containerName="cinder-api"
Oct 03 14:20:31 crc kubenswrapper[4636]: E1003 14:20:31.797965 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68364772-0202-4eb2-9789-c71d42592c48" containerName="init"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.797972 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="68364772-0202-4eb2-9789-c71d42592c48" containerName="init"
Oct 03 14:20:31 crc kubenswrapper[4636]: E1003 14:20:31.797988 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7efd4395-025e-4e34-9b79-d585b6eb32c7" containerName="cinder-api-log"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.797995 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="7efd4395-025e-4e34-9b79-d585b6eb32c7" containerName="cinder-api-log"
Oct 03 14:20:31 crc kubenswrapper[4636]: E1003 14:20:31.798029 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68364772-0202-4eb2-9789-c71d42592c48" containerName="dnsmasq-dns"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.798038 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="68364772-0202-4eb2-9789-c71d42592c48" containerName="dnsmasq-dns"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.798489 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="7efd4395-025e-4e34-9b79-d585b6eb32c7" containerName="cinder-api-log"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.798523 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="68364772-0202-4eb2-9789-c71d42592c48" containerName="dnsmasq-dns"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.798537 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="7efd4395-025e-4e34-9b79-d585b6eb32c7" containerName="cinder-api"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.800498 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.807366 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.808021 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.812574 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.848741 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.862672 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.863605 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-scripts\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.863655 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrx9c\" (UniqueName: \"kubernetes.io/projected/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-kube-api-access-zrx9c\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.863700 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-logs\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.863731 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-config-data-custom\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.863760 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-config-data\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.863835 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.863856 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.863894 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.966740 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.966787 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.966830 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.966914 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.966951 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-scripts\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.966987 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrx9c\" (UniqueName: \"kubernetes.io/projected/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-kube-api-access-zrx9c\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.967026 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-logs\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.967054 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-config-data-custom\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.967139 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-config-data\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.972243 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-logs\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.972243 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.978060 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-config-data\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.980677 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.984197 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-config-data-custom\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:31 crc kubenswrapper[4636]: I1003 14:20:31.990109 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-scripts\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:32 crc kubenswrapper[4636]: I1003 14:20:32.002155 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:32 crc kubenswrapper[4636]: I1003 14:20:32.005244 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrx9c\" (UniqueName: \"kubernetes.io/projected/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-kube-api-access-zrx9c\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:32 crc kubenswrapper[4636]: I1003 14:20:32.020371 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a3dbfb9-f2b2-4725-9960-07d3fb89125e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7a3dbfb9-f2b2-4725-9960-07d3fb89125e\") " pod="openstack/cinder-api-0"
Oct 03 14:20:32 crc kubenswrapper[4636]: I1003 14:20:32.164223 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 03 14:20:32 crc kubenswrapper[4636]: W1003 14:20:32.809656 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a3dbfb9_f2b2_4725_9960_07d3fb89125e.slice/crio-d3d861e568511c4dfa0cfafc99c58595d65532190b25a568ba92c7e9fb966f2d WatchSource:0}: Error finding container d3d861e568511c4dfa0cfafc99c58595d65532190b25a568ba92c7e9fb966f2d: Status 404 returned error can't find the container with id d3d861e568511c4dfa0cfafc99c58595d65532190b25a568ba92c7e9fb966f2d
Oct 03 14:20:32 crc kubenswrapper[4636]: I1003 14:20:32.810582 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7efd4395-025e-4e34-9b79-d585b6eb32c7" path="/var/lib/kubelet/pods/7efd4395-025e-4e34-9b79-d585b6eb32c7/volumes"
Oct 03 14:20:32 crc kubenswrapper[4636]: I1003 14:20:32.829305 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 03 14:20:33 crc kubenswrapper[4636]: I1003 14:20:33.019310 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-f548d674d-2q8gg" podUID="ee5faec7-3829-49b6-aca7-452f5eae6a67" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 03 14:20:33 crc kubenswrapper[4636]: I1003 14:20:33.019373 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-f548d674d-2q8gg" podUID="ee5faec7-3829-49b6-aca7-452f5eae6a67" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 03 14:20:33 crc kubenswrapper[4636]: I1003 14:20:33.442675 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7a3dbfb9-f2b2-4725-9960-07d3fb89125e","Type":"ContainerStarted","Data":"d3d861e568511c4dfa0cfafc99c58595d65532190b25a568ba92c7e9fb966f2d"}
Oct 03 14:20:33 crc kubenswrapper[4636]: I1003 14:20:33.657456 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Oct 03 14:20:33 crc kubenswrapper[4636]: I1003 14:20:33.660341 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="129b89f7-da91-4bcd-8b05-a1d7f669f513" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.165:8080/\": dial tcp 10.217.0.165:8080: connect: connection refused"
Oct 03 14:20:33 crc kubenswrapper[4636]: I1003 14:20:33.896557 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4"
Oct 03 14:20:33 crc kubenswrapper[4636]: I1003 14:20:33.976145 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-ddslh"]
Oct 03 14:20:33 crc kubenswrapper[4636]: I1003 14:20:33.976373 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-ddslh" podUID="da5e13d2-6aca-4810-bb7e-882f73b4aa33" containerName="dnsmasq-dns" containerID="cri-o://9cfa53e10033707a515a398a0393b9cc91c5a92d819782aaf54fdf1aaf583540" gracePeriod=10
Oct 03 14:20:34 crc kubenswrapper[4636]: I1003 14:20:34.024466 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-f548d674d-2q8gg" podUID="ee5faec7-3829-49b6-aca7-452f5eae6a67" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 03 14:20:34 crc kubenswrapper[4636]: I1003 14:20:34.494049 4636 generic.go:334] "Generic (PLEG): container finished" podID="da5e13d2-6aca-4810-bb7e-882f73b4aa33" containerID="9cfa53e10033707a515a398a0393b9cc91c5a92d819782aaf54fdf1aaf583540" exitCode=0
Oct 03 14:20:34 crc kubenswrapper[4636]: I1003 14:20:34.494150 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-ddslh" event={"ID":"da5e13d2-6aca-4810-bb7e-882f73b4aa33","Type":"ContainerDied","Data":"9cfa53e10033707a515a398a0393b9cc91c5a92d819782aaf54fdf1aaf583540"}
Oct 03 14:20:34 crc kubenswrapper[4636]: I1003 14:20:34.496549 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7a3dbfb9-f2b2-4725-9960-07d3fb89125e","Type":"ContainerStarted","Data":"112fb96bff5e7cb288917b6f8a6c976b1cc99b213fd745c5f4a02cc143595802"}
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.022831 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-ddslh"
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.159592 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-dns-swift-storage-0\") pod \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") "
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.159637 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-ovsdbserver-sb\") pod \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") "
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.159744 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-dns-svc\") pod \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") "
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.159769 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-config\") pod \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") "
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.159829 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-ovsdbserver-nb\") pod \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") "
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.159860 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkzgx\" (UniqueName: \"kubernetes.io/projected/da5e13d2-6aca-4810-bb7e-882f73b4aa33-kube-api-access-hkzgx\") pod \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\" (UID: \"da5e13d2-6aca-4810-bb7e-882f73b4aa33\") "
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.167601 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da5e13d2-6aca-4810-bb7e-882f73b4aa33-kube-api-access-hkzgx" (OuterVolumeSpecName: "kube-api-access-hkzgx") pod "da5e13d2-6aca-4810-bb7e-882f73b4aa33" (UID: "da5e13d2-6aca-4810-bb7e-882f73b4aa33"). InnerVolumeSpecName "kube-api-access-hkzgx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.261732 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkzgx\" (UniqueName: \"kubernetes.io/projected/da5e13d2-6aca-4810-bb7e-882f73b4aa33-kube-api-access-hkzgx\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.270179 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "da5e13d2-6aca-4810-bb7e-882f73b4aa33" (UID: "da5e13d2-6aca-4810-bb7e-882f73b4aa33"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.272447 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "da5e13d2-6aca-4810-bb7e-882f73b4aa33" (UID: "da5e13d2-6aca-4810-bb7e-882f73b4aa33"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.282014 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "da5e13d2-6aca-4810-bb7e-882f73b4aa33" (UID: "da5e13d2-6aca-4810-bb7e-882f73b4aa33"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.305645 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "da5e13d2-6aca-4810-bb7e-882f73b4aa33" (UID: "da5e13d2-6aca-4810-bb7e-882f73b4aa33"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.320669 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-config" (OuterVolumeSpecName: "config") pod "da5e13d2-6aca-4810-bb7e-882f73b4aa33" (UID: "da5e13d2-6aca-4810-bb7e-882f73b4aa33"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.363361 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-config\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.363397 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.363409 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.363421 4636 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.363431 4636 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da5e13d2-6aca-4810-bb7e-882f73b4aa33-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.507786 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7a3dbfb9-f2b2-4725-9960-07d3fb89125e","Type":"ContainerStarted","Data":"32bf09f920687d815280c3017270ea5c52fa8a112bc384dd835acea617243c65"}
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.507917 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.523245 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-ddslh" event={"ID":"da5e13d2-6aca-4810-bb7e-882f73b4aa33","Type":"ContainerDied","Data":"493ec061031c8d47a56117dbbac1b15b23bc25551a8360b677f6322e0501e2f8"}
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.523318 4636 scope.go:117] "RemoveContainer" containerID="9cfa53e10033707a515a398a0393b9cc91c5a92d819782aaf54fdf1aaf583540"
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.523539 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-ddslh"
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.557756 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.55771495 podStartE2EDuration="4.55771495s" podCreationTimestamp="2025-10-03 14:20:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:20:35.549853906 +0000 UTC m=+1185.408580143" watchObservedRunningTime="2025-10-03 14:20:35.55771495 +0000 UTC m=+1185.416441197"
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.559088 4636 scope.go:117] "RemoveContainer" containerID="383ff5430a786830efcfd2291ee62a10eeead8f9b01686dcdf2c5545c35e522a"
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.633162 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-ddslh"]
Oct 03 14:20:35 crc kubenswrapper[4636]: I1003 14:20:35.637596 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-ddslh"]
Oct 03 14:20:36 crc kubenswrapper[4636]: I1003 14:20:36.700386 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-f548d674d-2q8gg"
Oct 03 14:20:36 crc kubenswrapper[4636]: I1003 14:20:36.804533 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da5e13d2-6aca-4810-bb7e-882f73b4aa33" path="/var/lib/kubelet/pods/da5e13d2-6aca-4810-bb7e-882f73b4aa33/volumes"
Oct 03 14:20:36 crc kubenswrapper[4636]: I1003 14:20:36.861515 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7b4f64b6bf-z54p6"
Oct 03 14:20:37 crc kubenswrapper[4636]: I1003 14:20:37.342977 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-f548d674d-2q8gg"
Oct 03 14:20:37 crc kubenswrapper[4636]: I1003 14:20:37.424615 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-57b58d64fd-vdxfd"]
Oct 03 14:20:37 crc kubenswrapper[4636]: I1003 14:20:37.424933 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-57b58d64fd-vdxfd" podUID="615f5c9e-d9ff-4193-9477-478118e04b99" containerName="barbican-api-log" containerID="cri-o://8b4b27ac630a3a086d9e7199fad3641a18345ffcc77aa1c88400047e35d13f61" gracePeriod=30
Oct 03 14:20:37 crc kubenswrapper[4636]: I1003 14:20:37.425333 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-57b58d64fd-vdxfd" podUID="615f5c9e-d9ff-4193-9477-478118e04b99" containerName="barbican-api" containerID="cri-o://9053ed3170c2e9b80831b009d920e7b532be6831d884146f5ded24dee31bce3f" gracePeriod=30
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.354254 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Oct 03 14:20:38 crc kubenswrapper[4636]: E1003 14:20:38.354950 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da5e13d2-6aca-4810-bb7e-882f73b4aa33" containerName="init"
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.354965 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="da5e13d2-6aca-4810-bb7e-882f73b4aa33" containerName="init"
Oct 03 14:20:38 crc kubenswrapper[4636]: E1003 14:20:38.354996 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da5e13d2-6aca-4810-bb7e-882f73b4aa33" containerName="dnsmasq-dns"
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.355002 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="da5e13d2-6aca-4810-bb7e-882f73b4aa33" containerName="dnsmasq-dns"
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.355202 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="da5e13d2-6aca-4810-bb7e-882f73b4aa33" containerName="dnsmasq-dns"
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.365822 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.369596 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.370148 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.370891 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.377934 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-nxr47"
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.537187 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0a7aa438-f4f0-4975-a0e8-1005b56f8957-openstack-config\") pod \"openstackclient\" (UID: \"0a7aa438-f4f0-4975-a0e8-1005b56f8957\") " pod="openstack/openstackclient"
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.537602 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a7aa438-f4f0-4975-a0e8-1005b56f8957-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0a7aa438-f4f0-4975-a0e8-1005b56f8957\") " pod="openstack/openstackclient"
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.537824 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0a7aa438-f4f0-4975-a0e8-1005b56f8957-openstack-config-secret\") pod \"openstackclient\" (UID: \"0a7aa438-f4f0-4975-a0e8-1005b56f8957\") " pod="openstack/openstackclient"
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.538051 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9k2v\" (UniqueName: \"kubernetes.io/projected/0a7aa438-f4f0-4975-a0e8-1005b56f8957-kube-api-access-n9k2v\") pod \"openstackclient\" (UID: \"0a7aa438-f4f0-4975-a0e8-1005b56f8957\") " pod="openstack/openstackclient"
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.556361 4636 generic.go:334] "Generic (PLEG): container finished" podID="615f5c9e-d9ff-4193-9477-478118e04b99" containerID="8b4b27ac630a3a086d9e7199fad3641a18345ffcc77aa1c88400047e35d13f61" exitCode=143
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.556406 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57b58d64fd-vdxfd" event={"ID":"615f5c9e-d9ff-4193-9477-478118e04b99","Type":"ContainerDied","Data":"8b4b27ac630a3a086d9e7199fad3641a18345ffcc77aa1c88400047e35d13f61"}
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.639937 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0a7aa438-f4f0-4975-a0e8-1005b56f8957-openstack-config\") pod \"openstackclient\" (UID: \"0a7aa438-f4f0-4975-a0e8-1005b56f8957\") " pod="openstack/openstackclient"
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.640320 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a7aa438-f4f0-4975-a0e8-1005b56f8957-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0a7aa438-f4f0-4975-a0e8-1005b56f8957\") " pod="openstack/openstackclient"
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.640454 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0a7aa438-f4f0-4975-a0e8-1005b56f8957-openstack-config-secret\") pod \"openstackclient\" (UID: \"0a7aa438-f4f0-4975-a0e8-1005b56f8957\") " pod="openstack/openstackclient"
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.640614 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9k2v\" (UniqueName: \"kubernetes.io/projected/0a7aa438-f4f0-4975-a0e8-1005b56f8957-kube-api-access-n9k2v\") pod \"openstackclient\" (UID: \"0a7aa438-f4f0-4975-a0e8-1005b56f8957\") " pod="openstack/openstackclient"
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.640825 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0a7aa438-f4f0-4975-a0e8-1005b56f8957-openstack-config\") pod \"openstackclient\" (UID: \"0a7aa438-f4f0-4975-a0e8-1005b56f8957\") " pod="openstack/openstackclient"
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.648086 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a7aa438-f4f0-4975-a0e8-1005b56f8957-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0a7aa438-f4f0-4975-a0e8-1005b56f8957\") " pod="openstack/openstackclient"
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.648765 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0a7aa438-f4f0-4975-a0e8-1005b56f8957-openstack-config-secret\") pod \"openstackclient\" (UID: \"0a7aa438-f4f0-4975-a0e8-1005b56f8957\") " pod="openstack/openstackclient"
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.680454 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9k2v\" (UniqueName: \"kubernetes.io/projected/0a7aa438-f4f0-4975-a0e8-1005b56f8957-kube-api-access-n9k2v\") pod \"openstackclient\" (UID: \"0a7aa438-f4f0-4975-a0e8-1005b56f8957\") " pod="openstack/openstackclient"
Oct 03 14:20:38 crc kubenswrapper[4636]: I1003 14:20:38.734769 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 03 14:20:39 crc kubenswrapper[4636]: I1003 14:20:39.167244 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:20:39 crc kubenswrapper[4636]: I1003 14:20:39.167626 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:20:39 crc kubenswrapper[4636]: I1003 14:20:39.167673 4636 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ngmch"
Oct 03 14:20:39 crc kubenswrapper[4636]: I1003 14:20:39.168411 4636 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1d353a53ac9390ffae337e3feef5ea083eb94bb2a25b7898e4f341f0e42163eb"} pod="openshift-machine-config-operator/machine-config-daemon-ngmch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 03 14:20:39 crc kubenswrapper[4636]: I1003 14:20:39.168493 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" containerID="cri-o://1d353a53ac9390ffae337e3feef5ea083eb94bb2a25b7898e4f341f0e42163eb" gracePeriod=600
Oct 03 14:20:39 crc kubenswrapper[4636]: I1003 14:20:39.359893 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Oct 03 14:20:39 crc kubenswrapper[4636]: I1003 14:20:39.419036 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 03 14:20:39 crc kubenswrapper[4636]: I1003 14:20:39.532321 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Oct 03 14:20:39 crc kubenswrapper[4636]: I1003 14:20:39.587489 4636 generic.go:334] "Generic (PLEG): container finished" podID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerID="1d353a53ac9390ffae337e3feef5ea083eb94bb2a25b7898e4f341f0e42163eb" exitCode=0
Oct 03 14:20:39 crc kubenswrapper[4636]: I1003 14:20:39.587862 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerDied","Data":"1d353a53ac9390ffae337e3feef5ea083eb94bb2a25b7898e4f341f0e42163eb"}
Oct 03 14:20:39 crc kubenswrapper[4636]: I1003 14:20:39.587899 4636 scope.go:117] "RemoveContainer" containerID="07c604f152aa39f3430c1f62789f7be96b3f5a7c96a65ed6157e5d00f0a88d5d"
Oct 03 14:20:39 crc kubenswrapper[4636]: I1003 14:20:39.598884 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="129b89f7-da91-4bcd-8b05-a1d7f669f513" containerName="cinder-scheduler" containerID="cri-o://e1ce1984937464ee82d59b78359f630bdd453add7c0efc223471dc473f961010" gracePeriod=30
Oct 03 14:20:39 crc kubenswrapper[4636]: I1003 14:20:39.599169 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0a7aa438-f4f0-4975-a0e8-1005b56f8957","Type":"ContainerStarted","Data":"854b9124e71278a11e9dc673a347ed83b3c10c252d5f95bcddb78483c65ac160"}
Oct 03 14:20:39 crc kubenswrapper[4636]: I1003 14:20:39.599499 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="129b89f7-da91-4bcd-8b05-a1d7f669f513" containerName="probe" containerID="cri-o://f776a34259df175fcbbff60217d0d220f0ef4e8d246e3ae1a5e18c96ac75d3dc" gracePeriod=30
Oct 03 14:20:40 crc kubenswrapper[4636]: I1003 14:20:40.610553 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerStarted","Data":"3f35c195de607af5e2083a70ee704e67efe4c37e24910c615f6adb0ee1029e41"}
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.226182 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57b58d64fd-vdxfd"
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.335990 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.410117 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9s77\" (UniqueName: \"kubernetes.io/projected/615f5c9e-d9ff-4193-9477-478118e04b99-kube-api-access-l9s77\") pod \"615f5c9e-d9ff-4193-9477-478118e04b99\" (UID: \"615f5c9e-d9ff-4193-9477-478118e04b99\") "
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.410188 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/615f5c9e-d9ff-4193-9477-478118e04b99-logs\") pod \"615f5c9e-d9ff-4193-9477-478118e04b99\" (UID: \"615f5c9e-d9ff-4193-9477-478118e04b99\") "
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.410404 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615f5c9e-d9ff-4193-9477-478118e04b99-combined-ca-bundle\") pod \"615f5c9e-d9ff-4193-9477-478118e04b99\" (UID: \"615f5c9e-d9ff-4193-9477-478118e04b99\") "
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.412624 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/615f5c9e-d9ff-4193-9477-478118e04b99-config-data-custom\") pod \"615f5c9e-d9ff-4193-9477-478118e04b99\" (UID: \"615f5c9e-d9ff-4193-9477-478118e04b99\") "
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.412744 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615f5c9e-d9ff-4193-9477-478118e04b99-config-data\") pod \"615f5c9e-d9ff-4193-9477-478118e04b99\" (UID: \"615f5c9e-d9ff-4193-9477-478118e04b99\") "
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.426954 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/615f5c9e-d9ff-4193-9477-478118e04b99-logs" (OuterVolumeSpecName: "logs") pod "615f5c9e-d9ff-4193-9477-478118e04b99" (UID: "615f5c9e-d9ff-4193-9477-478118e04b99"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.436584 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/615f5c9e-d9ff-4193-9477-478118e04b99-kube-api-access-l9s77" (OuterVolumeSpecName: "kube-api-access-l9s77") pod "615f5c9e-d9ff-4193-9477-478118e04b99" (UID: "615f5c9e-d9ff-4193-9477-478118e04b99"). InnerVolumeSpecName "kube-api-access-l9s77". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.443639 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/615f5c9e-d9ff-4193-9477-478118e04b99-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "615f5c9e-d9ff-4193-9477-478118e04b99" (UID: "615f5c9e-d9ff-4193-9477-478118e04b99"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.501521 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/615f5c9e-d9ff-4193-9477-478118e04b99-config-data" (OuterVolumeSpecName: "config-data") pod "615f5c9e-d9ff-4193-9477-478118e04b99" (UID: "615f5c9e-d9ff-4193-9477-478118e04b99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.511817 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/615f5c9e-d9ff-4193-9477-478118e04b99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "615f5c9e-d9ff-4193-9477-478118e04b99" (UID: "615f5c9e-d9ff-4193-9477-478118e04b99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.525180 4636 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/615f5c9e-d9ff-4193-9477-478118e04b99-logs\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.525212 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615f5c9e-d9ff-4193-9477-478118e04b99-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.525222 4636 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/615f5c9e-d9ff-4193-9477-478118e04b99-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.525229 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615f5c9e-d9ff-4193-9477-478118e04b99-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.525239 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9s77\" (UniqueName: \"kubernetes.io/projected/615f5c9e-d9ff-4193-9477-478118e04b99-kube-api-access-l9s77\") on node \"crc\" DevicePath \"\""
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.633210 4636 generic.go:334] "Generic (PLEG): container finished" podID="615f5c9e-d9ff-4193-9477-478118e04b99" containerID="9053ed3170c2e9b80831b009d920e7b532be6831d884146f5ded24dee31bce3f" exitCode=0
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.633436 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57b58d64fd-vdxfd" event={"ID":"615f5c9e-d9ff-4193-9477-478118e04b99","Type":"ContainerDied","Data":"9053ed3170c2e9b80831b009d920e7b532be6831d884146f5ded24dee31bce3f"}
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.634500 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57b58d64fd-vdxfd" event={"ID":"615f5c9e-d9ff-4193-9477-478118e04b99","Type":"ContainerDied","Data":"db74f7afbd8c8386b3ff93c4bb1db3057fade10a3aa767b9648c989888aca581"}
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.634619 4636 scope.go:117] "RemoveContainer" containerID="9053ed3170c2e9b80831b009d920e7b532be6831d884146f5ded24dee31bce3f"
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.633535 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57b58d64fd-vdxfd"
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.645676 4636 generic.go:334] "Generic (PLEG): container finished" podID="129b89f7-da91-4bcd-8b05-a1d7f669f513" containerID="f776a34259df175fcbbff60217d0d220f0ef4e8d246e3ae1a5e18c96ac75d3dc" exitCode=0
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.646452 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"129b89f7-da91-4bcd-8b05-a1d7f669f513","Type":"ContainerDied","Data":"f776a34259df175fcbbff60217d0d220f0ef4e8d246e3ae1a5e18c96ac75d3dc"}
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.754385 4636 scope.go:117] "RemoveContainer" containerID="8b4b27ac630a3a086d9e7199fad3641a18345ffcc77aa1c88400047e35d13f61"
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.766768 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-57b58d64fd-vdxfd"]
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.778511 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-57b58d64fd-vdxfd"]
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.784306 4636 scope.go:117] "RemoveContainer" containerID="9053ed3170c2e9b80831b009d920e7b532be6831d884146f5ded24dee31bce3f"
Oct 03 14:20:41 crc kubenswrapper[4636]: E1003 14:20:41.784740 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9053ed3170c2e9b80831b009d920e7b532be6831d884146f5ded24dee31bce3f\": container with ID starting with 9053ed3170c2e9b80831b009d920e7b532be6831d884146f5ded24dee31bce3f not found: ID does not exist" containerID="9053ed3170c2e9b80831b009d920e7b532be6831d884146f5ded24dee31bce3f"
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.784770 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9053ed3170c2e9b80831b009d920e7b532be6831d884146f5ded24dee31bce3f"} err="failed to get container status \"9053ed3170c2e9b80831b009d920e7b532be6831d884146f5ded24dee31bce3f\": rpc error: code = NotFound desc = could not find container \"9053ed3170c2e9b80831b009d920e7b532be6831d884146f5ded24dee31bce3f\": container with ID starting with 9053ed3170c2e9b80831b009d920e7b532be6831d884146f5ded24dee31bce3f not found: ID does not exist"
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.784788 4636 scope.go:117] "RemoveContainer" containerID="8b4b27ac630a3a086d9e7199fad3641a18345ffcc77aa1c88400047e35d13f61"
Oct 03 14:20:41 crc kubenswrapper[4636]: E1003 14:20:41.799367 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b4b27ac630a3a086d9e7199fad3641a18345ffcc77aa1c88400047e35d13f61\": container with ID starting with 8b4b27ac630a3a086d9e7199fad3641a18345ffcc77aa1c88400047e35d13f61 not found: ID does not exist" containerID="8b4b27ac630a3a086d9e7199fad3641a18345ffcc77aa1c88400047e35d13f61"
Oct 03 14:20:41 crc kubenswrapper[4636]: I1003 14:20:41.799410 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b4b27ac630a3a086d9e7199fad3641a18345ffcc77aa1c88400047e35d13f61"} err="failed to get container status \"8b4b27ac630a3a086d9e7199fad3641a18345ffcc77aa1c88400047e35d13f61\": rpc error: code = NotFound desc = could not find container \"8b4b27ac630a3a086d9e7199fad3641a18345ffcc77aa1c88400047e35d13f61\": container with ID starting with 8b4b27ac630a3a086d9e7199fad3641a18345ffcc77aa1c88400047e35d13f61 not found: ID does not exist"
Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.007789 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6796cf444-9xs6c"
Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.281030 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6796cf444-9xs6c"
Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.608949 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.655044 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/129b89f7-da91-4bcd-8b05-a1d7f669f513-etc-machine-id\") pod \"129b89f7-da91-4bcd-8b05-a1d7f669f513\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") "
Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.655145 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-config-data\") pod \"129b89f7-da91-4bcd-8b05-a1d7f669f513\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") "
Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.655274 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzgg7\" (UniqueName: \"kubernetes.io/projected/129b89f7-da91-4bcd-8b05-a1d7f669f513-kube-api-access-lzgg7\") pod \"129b89f7-da91-4bcd-8b05-a1d7f669f513\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") "
Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.655299 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-combined-ca-bundle\") pod \"129b89f7-da91-4bcd-8b05-a1d7f669f513\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") "
Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.655411 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-scripts\") pod \"129b89f7-da91-4bcd-8b05-a1d7f669f513\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") "
Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.655451 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-config-data-custom\") pod \"129b89f7-da91-4bcd-8b05-a1d7f669f513\" (UID: \"129b89f7-da91-4bcd-8b05-a1d7f669f513\") "
Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.664657 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/129b89f7-da91-4bcd-8b05-a1d7f669f513-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "129b89f7-da91-4bcd-8b05-a1d7f669f513" (UID: "129b89f7-da91-4bcd-8b05-a1d7f669f513"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.680619 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/129b89f7-da91-4bcd-8b05-a1d7f669f513-kube-api-access-lzgg7" (OuterVolumeSpecName: "kube-api-access-lzgg7") pod "129b89f7-da91-4bcd-8b05-a1d7f669f513" (UID: "129b89f7-da91-4bcd-8b05-a1d7f669f513"). InnerVolumeSpecName "kube-api-access-lzgg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.681705 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "129b89f7-da91-4bcd-8b05-a1d7f669f513" (UID: "129b89f7-da91-4bcd-8b05-a1d7f669f513"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.689513 4636 generic.go:334] "Generic (PLEG): container finished" podID="129b89f7-da91-4bcd-8b05-a1d7f669f513" containerID="e1ce1984937464ee82d59b78359f630bdd453add7c0efc223471dc473f961010" exitCode=0
Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.690498 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.690933 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"129b89f7-da91-4bcd-8b05-a1d7f669f513","Type":"ContainerDied","Data":"e1ce1984937464ee82d59b78359f630bdd453add7c0efc223471dc473f961010"}
Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.690960 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"129b89f7-da91-4bcd-8b05-a1d7f669f513","Type":"ContainerDied","Data":"e5d775ad76db3368246eebbf2802dd2d43ebd1d1126f567f48da036e23f0a8b7"}
Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.690978 4636 scope.go:117] "RemoveContainer" containerID="f776a34259df175fcbbff60217d0d220f0ef4e8d246e3ae1a5e18c96ac75d3dc"
Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.692957 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-scripts" (OuterVolumeSpecName: "scripts") pod "129b89f7-da91-4bcd-8b05-a1d7f669f513" (UID: "129b89f7-da91-4bcd-8b05-a1d7f669f513"). InnerVolumeSpecName "scripts".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.790928 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.791354 4636 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.791423 4636 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/129b89f7-da91-4bcd-8b05-a1d7f669f513-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.791486 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzgg7\" (UniqueName: \"kubernetes.io/projected/129b89f7-da91-4bcd-8b05-a1d7f669f513-kube-api-access-lzgg7\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.801252 4636 scope.go:117] "RemoveContainer" containerID="e1ce1984937464ee82d59b78359f630bdd453add7c0efc223471dc473f961010" Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.848024 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="615f5c9e-d9ff-4193-9477-478118e04b99" path="/var/lib/kubelet/pods/615f5c9e-d9ff-4193-9477-478118e04b99/volumes" Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.871236 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "129b89f7-da91-4bcd-8b05-a1d7f669f513" (UID: "129b89f7-da91-4bcd-8b05-a1d7f669f513"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.903303 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:42 crc kubenswrapper[4636]: I1003 14:20:42.913323 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-config-data" (OuterVolumeSpecName: "config-data") pod "129b89f7-da91-4bcd-8b05-a1d7f669f513" (UID: "129b89f7-da91-4bcd-8b05-a1d7f669f513"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.006089 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129b89f7-da91-4bcd-8b05-a1d7f669f513-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.061170 4636 scope.go:117] "RemoveContainer" containerID="f776a34259df175fcbbff60217d0d220f0ef4e8d246e3ae1a5e18c96ac75d3dc" Oct 03 14:20:43 crc kubenswrapper[4636]: E1003 14:20:43.061742 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f776a34259df175fcbbff60217d0d220f0ef4e8d246e3ae1a5e18c96ac75d3dc\": container with ID starting with f776a34259df175fcbbff60217d0d220f0ef4e8d246e3ae1a5e18c96ac75d3dc not found: ID does not exist" containerID="f776a34259df175fcbbff60217d0d220f0ef4e8d246e3ae1a5e18c96ac75d3dc" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.061768 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f776a34259df175fcbbff60217d0d220f0ef4e8d246e3ae1a5e18c96ac75d3dc"} err="failed to get container status \"f776a34259df175fcbbff60217d0d220f0ef4e8d246e3ae1a5e18c96ac75d3dc\": rpc error: code = NotFound desc = could not find container \"f776a34259df175fcbbff60217d0d220f0ef4e8d246e3ae1a5e18c96ac75d3dc\": container with ID starting with f776a34259df175fcbbff60217d0d220f0ef4e8d246e3ae1a5e18c96ac75d3dc not found: ID does not exist" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.061788 4636 scope.go:117] "RemoveContainer" containerID="e1ce1984937464ee82d59b78359f630bdd453add7c0efc223471dc473f961010" Oct 03 14:20:43 crc kubenswrapper[4636]: E1003 14:20:43.062014 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1ce1984937464ee82d59b78359f630bdd453add7c0efc223471dc473f961010\": container with ID starting with e1ce1984937464ee82d59b78359f630bdd453add7c0efc223471dc473f961010 not found: ID does not exist" containerID="e1ce1984937464ee82d59b78359f630bdd453add7c0efc223471dc473f961010" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.062030 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1ce1984937464ee82d59b78359f630bdd453add7c0efc223471dc473f961010"} err="failed to get container status \"e1ce1984937464ee82d59b78359f630bdd453add7c0efc223471dc473f961010\": rpc error: code = NotFound desc = could not find container \"e1ce1984937464ee82d59b78359f630bdd453add7c0efc223471dc473f961010\": container with ID starting with e1ce1984937464ee82d59b78359f630bdd453add7c0efc223471dc473f961010 not found: ID does not exist" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.089802 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.106594 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.124048 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 14:20:43 crc kubenswrapper[4636]: E1003 14:20:43.124417 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615f5c9e-d9ff-4193-9477-478118e04b99" containerName="barbican-api-log" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.124432 4636 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="615f5c9e-d9ff-4193-9477-478118e04b99" containerName="barbican-api-log" Oct 03 14:20:43 crc kubenswrapper[4636]: E1003 14:20:43.124454 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615f5c9e-d9ff-4193-9477-478118e04b99" containerName="barbican-api" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.124459 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="615f5c9e-d9ff-4193-9477-478118e04b99" containerName="barbican-api" Oct 03 14:20:43 crc kubenswrapper[4636]: E1003 14:20:43.124479 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="129b89f7-da91-4bcd-8b05-a1d7f669f513" containerName="cinder-scheduler" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.124485 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="129b89f7-da91-4bcd-8b05-a1d7f669f513" containerName="cinder-scheduler" Oct 03 14:20:43 crc kubenswrapper[4636]: E1003 14:20:43.124496 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="129b89f7-da91-4bcd-8b05-a1d7f669f513" containerName="probe" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.124501 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="129b89f7-da91-4bcd-8b05-a1d7f669f513" containerName="probe" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.124669 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="615f5c9e-d9ff-4193-9477-478118e04b99" containerName="barbican-api-log" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.124677 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="615f5c9e-d9ff-4193-9477-478118e04b99" containerName="barbican-api" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.124694 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="129b89f7-da91-4bcd-8b05-a1d7f669f513" containerName="cinder-scheduler" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.124706 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="129b89f7-da91-4bcd-8b05-a1d7f669f513" containerName="probe" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.125630 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.131593 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.145396 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.210124 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94180bad-9d72-4d67-aefa-1fd7a9d886ac-scripts\") pod \"cinder-scheduler-0\" (UID: \"94180bad-9d72-4d67-aefa-1fd7a9d886ac\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.210190 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94180bad-9d72-4d67-aefa-1fd7a9d886ac-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"94180bad-9d72-4d67-aefa-1fd7a9d886ac\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.210231 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94180bad-9d72-4d67-aefa-1fd7a9d886ac-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"94180bad-9d72-4d67-aefa-1fd7a9d886ac\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.210386 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94180bad-9d72-4d67-aefa-1fd7a9d886ac-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"94180bad-9d72-4d67-aefa-1fd7a9d886ac\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.210424 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgzbt\" (UniqueName: \"kubernetes.io/projected/94180bad-9d72-4d67-aefa-1fd7a9d886ac-kube-api-access-wgzbt\") pod \"cinder-scheduler-0\" (UID: \"94180bad-9d72-4d67-aefa-1fd7a9d886ac\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.210455 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94180bad-9d72-4d67-aefa-1fd7a9d886ac-config-data\") pod \"cinder-scheduler-0\" (UID: \"94180bad-9d72-4d67-aefa-1fd7a9d886ac\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.312012 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94180bad-9d72-4d67-aefa-1fd7a9d886ac-scripts\") pod \"cinder-scheduler-0\" (UID: \"94180bad-9d72-4d67-aefa-1fd7a9d886ac\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.312067 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94180bad-9d72-4d67-aefa-1fd7a9d886ac-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"94180bad-9d72-4d67-aefa-1fd7a9d886ac\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.312113 4636 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94180bad-9d72-4d67-aefa-1fd7a9d886ac-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"94180bad-9d72-4d67-aefa-1fd7a9d886ac\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.312207 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94180bad-9d72-4d67-aefa-1fd7a9d886ac-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"94180bad-9d72-4d67-aefa-1fd7a9d886ac\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.312245 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgzbt\" (UniqueName: \"kubernetes.io/projected/94180bad-9d72-4d67-aefa-1fd7a9d886ac-kube-api-access-wgzbt\") pod \"cinder-scheduler-0\" (UID: \"94180bad-9d72-4d67-aefa-1fd7a9d886ac\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.312275 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94180bad-9d72-4d67-aefa-1fd7a9d886ac-config-data\") pod \"cinder-scheduler-0\" (UID: \"94180bad-9d72-4d67-aefa-1fd7a9d886ac\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.313234 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94180bad-9d72-4d67-aefa-1fd7a9d886ac-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"94180bad-9d72-4d67-aefa-1fd7a9d886ac\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.318262 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94180bad-9d72-4d67-aefa-1fd7a9d886ac-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"94180bad-9d72-4d67-aefa-1fd7a9d886ac\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.321219 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94180bad-9d72-4d67-aefa-1fd7a9d886ac-config-data\") pod \"cinder-scheduler-0\" (UID: \"94180bad-9d72-4d67-aefa-1fd7a9d886ac\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.321382 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94180bad-9d72-4d67-aefa-1fd7a9d886ac-scripts\") pod \"cinder-scheduler-0\" (UID: \"94180bad-9d72-4d67-aefa-1fd7a9d886ac\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.321951 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94180bad-9d72-4d67-aefa-1fd7a9d886ac-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"94180bad-9d72-4d67-aefa-1fd7a9d886ac\") " pod="openstack/cinder-scheduler-0" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.335850 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgzbt\" (UniqueName: \"kubernetes.io/projected/94180bad-9d72-4d67-aefa-1fd7a9d886ac-kube-api-access-wgzbt\") pod \"cinder-scheduler-0\" (UID: \"94180bad-9d72-4d67-aefa-1fd7a9d886ac\") " 
pod="openstack/cinder-scheduler-0" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.448047 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 14:20:43 crc kubenswrapper[4636]: I1003 14:20:43.650115 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7585984468-sqdbh" Oct 03 14:20:44 crc kubenswrapper[4636]: I1003 14:20:44.100070 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 14:20:44 crc kubenswrapper[4636]: W1003 14:20:44.126529 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94180bad_9d72_4d67_aefa_1fd7a9d886ac.slice/crio-3e30aa20fc1263b771393fe25f654973553377e9bd3b67948fe9b28b7f24764d WatchSource:0}: Error finding container 3e30aa20fc1263b771393fe25f654973553377e9bd3b67948fe9b28b7f24764d: Status 404 returned error can't find the container with id 3e30aa20fc1263b771393fe25f654973553377e9bd3b67948fe9b28b7f24764d Oct 03 14:20:44 crc kubenswrapper[4636]: I1003 14:20:44.730835 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"94180bad-9d72-4d67-aefa-1fd7a9d886ac","Type":"ContainerStarted","Data":"3e30aa20fc1263b771393fe25f654973553377e9bd3b67948fe9b28b7f24764d"} Oct 03 14:20:44 crc kubenswrapper[4636]: I1003 14:20:44.856534 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="129b89f7-da91-4bcd-8b05-a1d7f669f513" path="/var/lib/kubelet/pods/129b89f7-da91-4bcd-8b05-a1d7f669f513/volumes" Oct 03 14:20:45 crc kubenswrapper[4636]: I1003 14:20:45.748304 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"94180bad-9d72-4d67-aefa-1fd7a9d886ac","Type":"ContainerStarted","Data":"5cfc7e72c338c5839c4566d7b38e4627d41d6a200a84188d075bc18fb52d2691"} Oct 03 14:20:46 crc kubenswrapper[4636]: I1003 14:20:46.368359 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 03 14:20:46 crc kubenswrapper[4636]: I1003 14:20:46.769391 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"94180bad-9d72-4d67-aefa-1fd7a9d886ac","Type":"ContainerStarted","Data":"ff5697f2961f8452140e19a9ee7ddd5a6db7af65d85f5be99d64701fa0dd3519"} Oct 03 14:20:46 crc kubenswrapper[4636]: I1003 14:20:46.776457 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-d7d56d58f-cswwm" Oct 03 14:20:46 crc kubenswrapper[4636]: I1003 14:20:46.791707 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.791687498 podStartE2EDuration="3.791687498s" podCreationTimestamp="2025-10-03 14:20:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:20:46.789800549 +0000 UTC m=+1196.648526796" watchObservedRunningTime="2025-10-03 14:20:46.791687498 +0000 UTC m=+1196.650413735" Oct 03 14:20:46 crc kubenswrapper[4636]: I1003 14:20:46.864933 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7585984468-sqdbh"] Oct 03 14:20:46 crc kubenswrapper[4636]: I1003 14:20:46.869199 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7585984468-sqdbh" podUID="3616dd77-ea16-43c1-9d40-592fb7226c95" 
containerName="neutron-api" containerID="cri-o://07eaec71e05a6ff3da632c7ad94b5d9a035a32b09a6c3fa3c43ff26e3e26f127" gracePeriod=30 Oct 03 14:20:46 crc kubenswrapper[4636]: I1003 14:20:46.869463 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7585984468-sqdbh" podUID="3616dd77-ea16-43c1-9d40-592fb7226c95" containerName="neutron-httpd" containerID="cri-o://2af1a8964148e3fedde957f30efd5c8a8020efad1dc7d9c14a07ecf8d0362e28" gracePeriod=30 Oct 03 14:20:47 crc kubenswrapper[4636]: I1003 14:20:47.810062 4636 generic.go:334] "Generic (PLEG): container finished" podID="3616dd77-ea16-43c1-9d40-592fb7226c95" containerID="2af1a8964148e3fedde957f30efd5c8a8020efad1dc7d9c14a07ecf8d0362e28" exitCode=0 Oct 03 14:20:47 crc kubenswrapper[4636]: I1003 14:20:47.811226 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7585984468-sqdbh" event={"ID":"3616dd77-ea16-43c1-9d40-592fb7226c95","Type":"ContainerDied","Data":"2af1a8964148e3fedde957f30efd5c8a8020efad1dc7d9c14a07ecf8d0362e28"} Oct 03 14:20:48 crc kubenswrapper[4636]: E1003 14:20:48.371774 4636 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0025da7c_17f3_4036_a9fc_3330508c11cd.slice/crio-79ec2b52d512bd5fdecc8099ed615849a5e48db694fa62ea57cc7a7daeb17a1c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0025da7c_17f3_4036_a9fc_3330508c11cd.slice/crio-conmon-79ec2b52d512bd5fdecc8099ed615849a5e48db694fa62ea57cc7a7daeb17a1c.scope\": RecentStats: unable to find data in memory cache]" Oct 03 14:20:48 crc kubenswrapper[4636]: I1003 14:20:48.448476 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 03 14:20:48 crc kubenswrapper[4636]: I1003 14:20:48.821489 4636 generic.go:334] "Generic (PLEG): container finished" podID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerID="6f203755d3b7d2412b112d16f7778187f0cdb274206e2b3e4aaeccb274cae768" exitCode=137 Oct 03 14:20:48 crc kubenswrapper[4636]: I1003 14:20:48.821543 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7976d47688-kx5v5" event={"ID":"92ef2fa8-5e4e-49f1-8840-01b5be29d036","Type":"ContainerDied","Data":"6f203755d3b7d2412b112d16f7778187f0cdb274206e2b3e4aaeccb274cae768"} Oct 03 14:20:48 crc kubenswrapper[4636]: I1003 14:20:48.821583 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7976d47688-kx5v5" event={"ID":"92ef2fa8-5e4e-49f1-8840-01b5be29d036","Type":"ContainerStarted","Data":"04ea3c8403ac5e4db0992d3e740398054c3ed5e4c5fb3d90d47fa1626284e1c7"} Oct 03 14:20:48 crc kubenswrapper[4636]: I1003 14:20:48.827382 4636 generic.go:334] "Generic (PLEG): container finished" podID="0025da7c-17f3-4036-a9fc-3330508c11cd" containerID="79ec2b52d512bd5fdecc8099ed615849a5e48db694fa62ea57cc7a7daeb17a1c" exitCode=137 Oct 03 14:20:48 crc kubenswrapper[4636]: I1003 14:20:48.827472 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c5bc9456-rfvns" event={"ID":"0025da7c-17f3-4036-a9fc-3330508c11cd","Type":"ContainerDied","Data":"79ec2b52d512bd5fdecc8099ed615849a5e48db694fa62ea57cc7a7daeb17a1c"} Oct 03 14:20:48 crc kubenswrapper[4636]: I1003 14:20:48.827540 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c5bc9456-rfvns" 
event={"ID":"0025da7c-17f3-4036-a9fc-3330508c11cd","Type":"ContainerStarted","Data":"7e567cadd2838aa06c424f97603f8583e935e05e6910d6f0fa8a2cdbd59d026e"} Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.525744 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6766dbb747-7j5j7"] Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.535465 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.540315 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.542155 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.542791 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.545404 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6766dbb747-7j5j7"] Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.657964 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-log-httpd\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.658018 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-etc-swift\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.658059 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-run-httpd\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.658080 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-config-data\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.658129 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-combined-ca-bundle\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.658187 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-internal-tls-certs\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " 
pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.658211 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh2j7\" (UniqueName: \"kubernetes.io/projected/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-kube-api-access-hh2j7\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.658242 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-public-tls-certs\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.759705 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-internal-tls-certs\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.759754 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh2j7\" (UniqueName: \"kubernetes.io/projected/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-kube-api-access-hh2j7\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.759792 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-public-tls-certs\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.759845 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-log-httpd\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.759875 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-etc-swift\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.759910 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-run-httpd\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.759931 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-config-data\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " 
pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.760024 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-combined-ca-bundle\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.760512 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-log-httpd\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.760660 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-run-httpd\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.768335 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-config-data\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.770328 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-combined-ca-bundle\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.770351 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-public-tls-certs\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.782441 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-internal-tls-certs\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.794363 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-etc-swift\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.818842 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh2j7\" (UniqueName: \"kubernetes.io/projected/e0a3acac-6d5f-49d7-9b2e-52bd155fb674-kube-api-access-hh2j7\") pod \"swift-proxy-6766dbb747-7j5j7\" (UID: \"e0a3acac-6d5f-49d7-9b2e-52bd155fb674\") " pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:49 crc kubenswrapper[4636]: I1003 14:20:49.880851 4636 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:50 crc kubenswrapper[4636]: I1003 14:20:50.607427 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6766dbb747-7j5j7"] Oct 03 14:20:50 crc kubenswrapper[4636]: W1003 14:20:50.635249 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0a3acac_6d5f_49d7_9b2e_52bd155fb674.slice/crio-98dc8d51a1475f508ca673bed6a1bc127c523bce0c2891dd55386e8b95ca76f8 WatchSource:0}: Error finding container 98dc8d51a1475f508ca673bed6a1bc127c523bce0c2891dd55386e8b95ca76f8: Status 404 returned error can't find the container with id 98dc8d51a1475f508ca673bed6a1bc127c523bce0c2891dd55386e8b95ca76f8 Oct 03 14:20:50 crc kubenswrapper[4636]: I1003 14:20:50.877339 4636 generic.go:334] "Generic (PLEG): container finished" podID="3616dd77-ea16-43c1-9d40-592fb7226c95" containerID="07eaec71e05a6ff3da632c7ad94b5d9a035a32b09a6c3fa3c43ff26e3e26f127" exitCode=0 Oct 03 14:20:50 crc kubenswrapper[4636]: I1003 14:20:50.877691 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7585984468-sqdbh" event={"ID":"3616dd77-ea16-43c1-9d40-592fb7226c95","Type":"ContainerDied","Data":"07eaec71e05a6ff3da632c7ad94b5d9a035a32b09a6c3fa3c43ff26e3e26f127"} Oct 03 14:20:50 crc kubenswrapper[4636]: I1003 14:20:50.880412 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6766dbb747-7j5j7" event={"ID":"e0a3acac-6d5f-49d7-9b2e-52bd155fb674","Type":"ContainerStarted","Data":"98dc8d51a1475f508ca673bed6a1bc127c523bce0c2891dd55386e8b95ca76f8"} Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.209646 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7585984468-sqdbh" Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.296303 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-ovndb-tls-certs\") pod \"3616dd77-ea16-43c1-9d40-592fb7226c95\" (UID: \"3616dd77-ea16-43c1-9d40-592fb7226c95\") " Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.296346 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-combined-ca-bundle\") pod \"3616dd77-ea16-43c1-9d40-592fb7226c95\" (UID: \"3616dd77-ea16-43c1-9d40-592fb7226c95\") " Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.296435 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-config\") pod \"3616dd77-ea16-43c1-9d40-592fb7226c95\" (UID: \"3616dd77-ea16-43c1-9d40-592fb7226c95\") " Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.296457 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-httpd-config\") pod \"3616dd77-ea16-43c1-9d40-592fb7226c95\" (UID: \"3616dd77-ea16-43c1-9d40-592fb7226c95\") " Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.296483 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhs26\" (UniqueName: \"kubernetes.io/projected/3616dd77-ea16-43c1-9d40-592fb7226c95-kube-api-access-qhs26\") pod \"3616dd77-ea16-43c1-9d40-592fb7226c95\" (UID: \"3616dd77-ea16-43c1-9d40-592fb7226c95\") " Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.354949 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "3616dd77-ea16-43c1-9d40-592fb7226c95" (UID: "3616dd77-ea16-43c1-9d40-592fb7226c95"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.357635 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3616dd77-ea16-43c1-9d40-592fb7226c95-kube-api-access-qhs26" (OuterVolumeSpecName: "kube-api-access-qhs26") pod "3616dd77-ea16-43c1-9d40-592fb7226c95" (UID: "3616dd77-ea16-43c1-9d40-592fb7226c95"). InnerVolumeSpecName "kube-api-access-qhs26". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.404743 4636 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.404780 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhs26\" (UniqueName: \"kubernetes.io/projected/3616dd77-ea16-43c1-9d40-592fb7226c95-kube-api-access-qhs26\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.443307 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3616dd77-ea16-43c1-9d40-592fb7226c95" (UID: "3616dd77-ea16-43c1-9d40-592fb7226c95"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.456591 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "3616dd77-ea16-43c1-9d40-592fb7226c95" (UID: "3616dd77-ea16-43c1-9d40-592fb7226c95"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.466770 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-config" (OuterVolumeSpecName: "config") pod "3616dd77-ea16-43c1-9d40-592fb7226c95" (UID: "3616dd77-ea16-43c1-9d40-592fb7226c95"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.508307 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.508353 4636 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.508368 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3616dd77-ea16-43c1-9d40-592fb7226c95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.894845 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7585984468-sqdbh" event={"ID":"3616dd77-ea16-43c1-9d40-592fb7226c95","Type":"ContainerDied","Data":"e6967a73fd3f9ba55aa216866f27a89213432b96fe5d45f80679547bf9003346"} Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.894933 4636 scope.go:117] "RemoveContainer" containerID="2af1a8964148e3fedde957f30efd5c8a8020efad1dc7d9c14a07ecf8d0362e28" Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.894863 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7585984468-sqdbh" Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.896784 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6766dbb747-7j5j7" event={"ID":"e0a3acac-6d5f-49d7-9b2e-52bd155fb674","Type":"ContainerStarted","Data":"4a4b40912b544667dcd7f319e3197bf19553383e1fcea8a9622efdd1771f1ee4"} Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.896817 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6766dbb747-7j5j7" event={"ID":"e0a3acac-6d5f-49d7-9b2e-52bd155fb674","Type":"ContainerStarted","Data":"54449fd57257ee83d9ec87512657fd1ca71f4cff83cddf23018beda303e00183"} Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.896920 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.955836 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6766dbb747-7j5j7" podStartSLOduration=2.955811377 podStartE2EDuration="2.955811377s" podCreationTimestamp="2025-10-03 14:20:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:20:51.946928727 +0000 UTC m=+1201.805654974" watchObservedRunningTime="2025-10-03 14:20:51.955811377 +0000 UTC m=+1201.814537624" Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.971467 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7585984468-sqdbh"] Oct 03 14:20:51 crc kubenswrapper[4636]: I1003 14:20:51.983663 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7585984468-sqdbh"] Oct 03 14:20:52 crc kubenswrapper[4636]: I1003 14:20:52.807437 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3616dd77-ea16-43c1-9d40-592fb7226c95" path="/var/lib/kubelet/pods/3616dd77-ea16-43c1-9d40-592fb7226c95/volumes" Oct 03 14:20:52 crc kubenswrapper[4636]: I1003 14:20:52.906114 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:53 crc kubenswrapper[4636]: I1003 14:20:53.347719 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:20:53 crc kubenswrapper[4636]: I1003 14:20:53.348061 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" containerName="ceilometer-central-agent" containerID="cri-o://3a88bc976bacd430f39b404dc8dfadf6b2c862db4a5ba83918668bb3ee243d84" gracePeriod=30 Oct 03 14:20:53 crc kubenswrapper[4636]: I1003 14:20:53.348204 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" containerName="proxy-httpd" containerID="cri-o://a6ad748b87c9a3535ceb9f586f48083f2d1bd233d5773e1951e9e31ad55886c1" gracePeriod=30 Oct 03 14:20:53 crc kubenswrapper[4636]: I1003 14:20:53.348255 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" containerName="sg-core" containerID="cri-o://65cb10ffdac695f678230c7ebcb1b6f1734bebb98e527bc2d15b9cd26d0a1acb" gracePeriod=30 Oct 03 14:20:53 crc kubenswrapper[4636]: I1003 14:20:53.348275 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" containerName="ceilometer-notification-agent" containerID="cri-o://2093127f95e374b6967df7e40de3db189d17633811478466ab45e5c4767dc7cf" gracePeriod=30 Oct 03 14:20:53 crc kubenswrapper[4636]: I1003 14:20:53.916699 4636 generic.go:334] "Generic (PLEG): container finished" podID="b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" containerID="a6ad748b87c9a3535ceb9f586f48083f2d1bd233d5773e1951e9e31ad55886c1" exitCode=0 Oct 03 14:20:53 crc kubenswrapper[4636]: I1003 14:20:53.916741 4636 generic.go:334] "Generic (PLEG): container finished" podID="b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" containerID="65cb10ffdac695f678230c7ebcb1b6f1734bebb98e527bc2d15b9cd26d0a1acb" exitCode=2 Oct 03 14:20:53 crc kubenswrapper[4636]: I1003 14:20:53.916781 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8","Type":"ContainerDied","Data":"a6ad748b87c9a3535ceb9f586f48083f2d1bd233d5773e1951e9e31ad55886c1"} Oct 03 14:20:53 crc kubenswrapper[4636]: I1003 14:20:53.916825 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8","Type":"ContainerDied","Data":"65cb10ffdac695f678230c7ebcb1b6f1734bebb98e527bc2d15b9cd26d0a1acb"} Oct 03 14:20:54 crc kubenswrapper[4636]: I1003 14:20:54.086858 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 03 14:20:54 crc kubenswrapper[4636]: I1003 14:20:54.934940 4636 generic.go:334] "Generic (PLEG): container finished" podID="b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" containerID="3a88bc976bacd430f39b404dc8dfadf6b2c862db4a5ba83918668bb3ee243d84" exitCode=0 Oct 03 14:20:54 crc kubenswrapper[4636]: I1003 14:20:54.934990 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8","Type":"ContainerDied","Data":"3a88bc976bacd430f39b404dc8dfadf6b2c862db4a5ba83918668bb3ee243d84"} Oct 03 14:20:55 crc kubenswrapper[4636]: I1003 14:20:55.947392 4636 generic.go:334] "Generic (PLEG): container finished" podID="b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" containerID="2093127f95e374b6967df7e40de3db189d17633811478466ab45e5c4767dc7cf" exitCode=0 Oct 03 14:20:55 crc kubenswrapper[4636]: I1003 14:20:55.947462 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8","Type":"ContainerDied","Data":"2093127f95e374b6967df7e40de3db189d17633811478466ab45e5c4767dc7cf"} Oct 03 14:20:57 crc kubenswrapper[4636]: I1003 14:20:57.692091 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:20:57 crc kubenswrapper[4636]: I1003 14:20:57.692417 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:20:57 crc kubenswrapper[4636]: I1003 14:20:57.974422 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:20:57 crc kubenswrapper[4636]: I1003 14:20:57.974461 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:20:59 crc kubenswrapper[4636]: I1003 14:20:59.379484 4636 scope.go:117] "RemoveContainer" containerID="07eaec71e05a6ff3da632c7ad94b5d9a035a32b09a6c3fa3c43ff26e3e26f127" Oct 03 14:20:59 crc kubenswrapper[4636]: I1003 14:20:59.860585 4636 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:20:59 crc kubenswrapper[4636]: I1003 14:20:59.917001 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:59 crc kubenswrapper[4636]: I1003 14:20:59.917062 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6766dbb747-7j5j7" Oct 03 14:20:59 crc kubenswrapper[4636]: I1003 14:20:59.971600 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-config-data\") pod \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " Oct 03 14:20:59 crc kubenswrapper[4636]: I1003 14:20:59.971658 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-combined-ca-bundle\") pod \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " Oct 03 14:20:59 crc kubenswrapper[4636]: I1003 14:20:59.971724 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9zv5\" (UniqueName: \"kubernetes.io/projected/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-kube-api-access-h9zv5\") pod \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " Oct 03 14:20:59 crc kubenswrapper[4636]: I1003 14:20:59.971766 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-sg-core-conf-yaml\") pod \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " Oct 03 14:20:59 crc kubenswrapper[4636]: I1003 14:20:59.971845 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-log-httpd\") pod \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " Oct 03 14:20:59 crc kubenswrapper[4636]: I1003 14:20:59.971865 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-run-httpd\") pod \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " Oct 03 14:20:59 crc kubenswrapper[4636]: I1003 14:20:59.971909 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-scripts\") pod \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\" (UID: \"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8\") " Oct 03 14:20:59 crc kubenswrapper[4636]: I1003 14:20:59.976040 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" (UID: "b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:20:59 crc kubenswrapper[4636]: I1003 14:20:59.982145 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-kube-api-access-h9zv5" (OuterVolumeSpecName: "kube-api-access-h9zv5") pod "b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" (UID: "b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8"). InnerVolumeSpecName "kube-api-access-h9zv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:20:59 crc kubenswrapper[4636]: I1003 14:20:59.995811 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-scripts" (OuterVolumeSpecName: "scripts") pod "b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" (UID: "b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:20:59 crc kubenswrapper[4636]: I1003 14:20:59.995955 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" (UID: "b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:20:59 crc kubenswrapper[4636]: I1003 14:20:59.996633 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0a7aa438-f4f0-4975-a0e8-1005b56f8957","Type":"ContainerStarted","Data":"a9129df05c68b1fc385e72692f702f4b05573a3826e28c8082d8fb95267c19db"} Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.011069 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8","Type":"ContainerDied","Data":"2dbcecde24a5d7471fdb4745e0eb311375f9ea933f7baedd22c8e24617070e90"} Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.011138 4636 scope.go:117] "RemoveContainer" containerID="a6ad748b87c9a3535ceb9f586f48083f2d1bd233d5773e1951e9e31ad55886c1" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.011281 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.029378 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.027440571 podStartE2EDuration="22.029352496s" podCreationTimestamp="2025-10-03 14:20:38 +0000 UTC" firstStartedPulling="2025-10-03 14:20:39.551703113 +0000 UTC m=+1189.410429360" lastFinishedPulling="2025-10-03 14:20:59.553615038 +0000 UTC m=+1209.412341285" observedRunningTime="2025-10-03 14:21:00.02104954 +0000 UTC m=+1209.879775787" watchObservedRunningTime="2025-10-03 14:21:00.029352496 +0000 UTC m=+1209.888078743" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.064989 4636 scope.go:117] "RemoveContainer" containerID="65cb10ffdac695f678230c7ebcb1b6f1734bebb98e527bc2d15b9cd26d0a1acb" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.064989 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" (UID: "b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.074847 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9zv5\" (UniqueName: \"kubernetes.io/projected/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-kube-api-access-h9zv5\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.074875 4636 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.074884 4636 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.074893 4636 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.074903 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.108799 4636 scope.go:117] "RemoveContainer" containerID="2093127f95e374b6967df7e40de3db189d17633811478466ab45e5c4767dc7cf" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.135756 4636 scope.go:117] "RemoveContainer" containerID="3a88bc976bacd430f39b404dc8dfadf6b2c862db4a5ba83918668bb3ee243d84" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.155356 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-config-data" (OuterVolumeSpecName: "config-data") pod "b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" (UID: "b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.159301 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" (UID: "b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.176540 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.176580 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.352526 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.359824 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.375063 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:00 crc kubenswrapper[4636]: E1003 14:21:00.375517 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3616dd77-ea16-43c1-9d40-592fb7226c95" containerName="neutron-api" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.375541 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="3616dd77-ea16-43c1-9d40-592fb7226c95" containerName="neutron-api" Oct 03 14:21:00 crc kubenswrapper[4636]: E1003 14:21:00.375562 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" containerName="ceilometer-central-agent" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.375570 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" containerName="ceilometer-central-agent" Oct 03 14:21:00 crc kubenswrapper[4636]: E1003 14:21:00.375589 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" containerName="sg-core" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.375597 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" containerName="sg-core" Oct 03 14:21:00 crc kubenswrapper[4636]: E1003 14:21:00.375610 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" containerName="ceilometer-notification-agent" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.375619 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" containerName="ceilometer-notification-agent" Oct 03 14:21:00 crc kubenswrapper[4636]: E1003 14:21:00.375628 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" containerName="proxy-httpd" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.375635 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" containerName="proxy-httpd" Oct 03 14:21:00 crc kubenswrapper[4636]: E1003 14:21:00.375643 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3616dd77-ea16-43c1-9d40-592fb7226c95" containerName="neutron-httpd" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.375649 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="3616dd77-ea16-43c1-9d40-592fb7226c95" containerName="neutron-httpd" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.375894 4636 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" containerName="proxy-httpd" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.375913 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" containerName="ceilometer-notification-agent" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.375928 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" containerName="ceilometer-central-agent" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.375947 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" containerName="sg-core" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.375956 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="3616dd77-ea16-43c1-9d40-592fb7226c95" containerName="neutron-api" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.375969 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="3616dd77-ea16-43c1-9d40-592fb7226c95" containerName="neutron-httpd" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.404476 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.404588 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.413021 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.413248 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.489133 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.489173 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-log-httpd\") pod \"ceilometer-0\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.489233 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-run-httpd\") pod \"ceilometer-0\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.489257 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-scripts\") pod \"ceilometer-0\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.489304 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj2m8\" (UniqueName: \"kubernetes.io/projected/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-kube-api-access-pj2m8\") pod \"ceilometer-0\" (UID: 
\"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.489333 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-config-data\") pod \"ceilometer-0\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.489398 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.590713 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj2m8\" (UniqueName: \"kubernetes.io/projected/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-kube-api-access-pj2m8\") pod \"ceilometer-0\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.590760 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-config-data\") pod \"ceilometer-0\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.590819 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.590876 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.590893 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-log-httpd\") pod \"ceilometer-0\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.590934 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-run-httpd\") pod \"ceilometer-0\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.590953 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-scripts\") pod \"ceilometer-0\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.591643 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-log-httpd\") pod 
\"ceilometer-0\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.591679 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-run-httpd\") pod \"ceilometer-0\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.596259 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-config-data\") pod \"ceilometer-0\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.596892 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-scripts\") pod \"ceilometer-0\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.597587 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.605595 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.616071 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj2m8\" (UniqueName: \"kubernetes.io/projected/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-kube-api-access-pj2m8\") pod \"ceilometer-0\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.724697 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.724913 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="25cea5cd-0d10-4569-952f-a884d6478382" containerName="kube-state-metrics" containerID="cri-o://2c77e85082d5dc686be74e7abede4ccbb2b24ed06f947e770b4850e36939b31f" gracePeriod=30 Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.760132 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:21:00 crc kubenswrapper[4636]: I1003 14:21:00.818946 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8" path="/var/lib/kubelet/pods/b7d511ee-3e7e-4ede-9bb7-fdae52b1b0b8/volumes" Oct 03 14:21:01 crc kubenswrapper[4636]: I1003 14:21:01.045392 4636 generic.go:334] "Generic (PLEG): container finished" podID="25cea5cd-0d10-4569-952f-a884d6478382" containerID="2c77e85082d5dc686be74e7abede4ccbb2b24ed06f947e770b4850e36939b31f" exitCode=2 Oct 03 14:21:01 crc kubenswrapper[4636]: I1003 14:21:01.045471 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"25cea5cd-0d10-4569-952f-a884d6478382","Type":"ContainerDied","Data":"2c77e85082d5dc686be74e7abede4ccbb2b24ed06f947e770b4850e36939b31f"} Oct 03 14:21:01 crc kubenswrapper[4636]: I1003 14:21:01.328813 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:01 crc kubenswrapper[4636]: I1003 14:21:01.330725 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 14:21:01 crc kubenswrapper[4636]: I1003 14:21:01.413955 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7z9f\" (UniqueName: \"kubernetes.io/projected/25cea5cd-0d10-4569-952f-a884d6478382-kube-api-access-g7z9f\") pod \"25cea5cd-0d10-4569-952f-a884d6478382\" (UID: \"25cea5cd-0d10-4569-952f-a884d6478382\") " Oct 03 14:21:01 crc kubenswrapper[4636]: I1003 14:21:01.427384 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25cea5cd-0d10-4569-952f-a884d6478382-kube-api-access-g7z9f" (OuterVolumeSpecName: "kube-api-access-g7z9f") pod "25cea5cd-0d10-4569-952f-a884d6478382" (UID: "25cea5cd-0d10-4569-952f-a884d6478382"). InnerVolumeSpecName "kube-api-access-g7z9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:21:01 crc kubenswrapper[4636]: I1003 14:21:01.516257 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7z9f\" (UniqueName: \"kubernetes.io/projected/25cea5cd-0d10-4569-952f-a884d6478382-kube-api-access-g7z9f\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.087254 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c","Type":"ContainerStarted","Data":"f8aee5558ee5bbcdb9e0fac223549fe9225b29f47b8d2d9ca7e1f45e8384fe0e"} Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.116434 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"25cea5cd-0d10-4569-952f-a884d6478382","Type":"ContainerDied","Data":"80dc665eeadb9b61aa789e370d9001a9d44833811ed6dfcdf24cb403d9c1b380"} Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.116485 4636 scope.go:117] "RemoveContainer" containerID="2c77e85082d5dc686be74e7abede4ccbb2b24ed06f947e770b4850e36939b31f" Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.116606 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.198053 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.224252 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.234432 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 14:21:02 crc kubenswrapper[4636]: E1003 14:21:02.234853 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25cea5cd-0d10-4569-952f-a884d6478382" containerName="kube-state-metrics" Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.234872 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="25cea5cd-0d10-4569-952f-a884d6478382" containerName="kube-state-metrics" Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.235105 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="25cea5cd-0d10-4569-952f-a884d6478382" containerName="kube-state-metrics" Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.235721 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.241275 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.241497 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.252895 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.347833 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb8bv\" (UniqueName: \"kubernetes.io/projected/f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0-kube-api-access-pb8bv\") pod \"kube-state-metrics-0\" (UID: \"f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0\") " pod="openstack/kube-state-metrics-0" Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.347927 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0\") " pod="openstack/kube-state-metrics-0" Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.348208 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0\") " pod="openstack/kube-state-metrics-0" Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.348268 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0\") " pod="openstack/kube-state-metrics-0" Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.449653 4636 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0\") " pod="openstack/kube-state-metrics-0" Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.449967 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0\") " pod="openstack/kube-state-metrics-0" Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.450002 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb8bv\" (UniqueName: \"kubernetes.io/projected/f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0-kube-api-access-pb8bv\") pod \"kube-state-metrics-0\" (UID: \"f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0\") " pod="openstack/kube-state-metrics-0" Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.450041 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0\") " pod="openstack/kube-state-metrics-0" Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.454280 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0\") " pod="openstack/kube-state-metrics-0" Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.458617 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0\") " pod="openstack/kube-state-metrics-0" Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.467085 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0\") " pod="openstack/kube-state-metrics-0" Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.473804 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb8bv\" (UniqueName: \"kubernetes.io/projected/f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0-kube-api-access-pb8bv\") pod \"kube-state-metrics-0\" (UID: \"f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0\") " pod="openstack/kube-state-metrics-0" Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.559346 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 14:21:02 crc kubenswrapper[4636]: I1003 14:21:02.823044 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25cea5cd-0d10-4569-952f-a884d6478382" path="/var/lib/kubelet/pods/25cea5cd-0d10-4569-952f-a884d6478382/volumes" Oct 03 14:21:03 crc kubenswrapper[4636]: I1003 14:21:03.124092 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 14:21:03 crc kubenswrapper[4636]: I1003 14:21:03.134914 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c","Type":"ContainerStarted","Data":"c9e2b973779766890256b7bcc9c85345aa5aa687ccd4761c8bd0c11d74c17ec2"} Oct 03 14:21:03 crc kubenswrapper[4636]: I1003 14:21:03.574663 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:21:03 crc kubenswrapper[4636]: I1003 14:21:03.576059 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fdd6b4ee-372a-42a0-a353-b3a82463d3ff" containerName="glance-log" containerID="cri-o://4c704ee2e692fc51c22595e16d7e80038cffaa667db593234f019af812029b0c" gracePeriod=30 Oct 03 14:21:03 crc kubenswrapper[4636]: I1003 14:21:03.576146 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fdd6b4ee-372a-42a0-a353-b3a82463d3ff" containerName="glance-httpd" containerID="cri-o://eeec22460d487729f2abc1d313a5bd005c59d08335b1724f0b02955f839bb9f1" gracePeriod=30 Oct 03 14:21:04 crc kubenswrapper[4636]: I1003 14:21:04.148769 4636 generic.go:334] "Generic (PLEG): container finished" podID="fdd6b4ee-372a-42a0-a353-b3a82463d3ff" containerID="4c704ee2e692fc51c22595e16d7e80038cffaa667db593234f019af812029b0c" exitCode=143 Oct 03 14:21:04 crc kubenswrapper[4636]: I1003 14:21:04.148820 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fdd6b4ee-372a-42a0-a353-b3a82463d3ff","Type":"ContainerDied","Data":"4c704ee2e692fc51c22595e16d7e80038cffaa667db593234f019af812029b0c"} Oct 03 14:21:04 crc kubenswrapper[4636]: I1003 14:21:04.150358 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0","Type":"ContainerStarted","Data":"b9d2a2c8eeacc862cef0c263975c57ccc43721bd7a4c27f8f5fc1f6780083331"} Oct 03 14:21:04 crc kubenswrapper[4636]: I1003 14:21:04.150395 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0","Type":"ContainerStarted","Data":"60e7eef8be5748fbdc06c3778a67c3293613a6c9ffdc6fc18c2fdfd2176c1973"} Oct 03 14:21:04 crc kubenswrapper[4636]: I1003 14:21:04.152014 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c","Type":"ContainerStarted","Data":"220727fc6e6a0a8ae5e4aa31aabbafe261276d69081bd0a05da05a92f2a68e40"} Oct 03 14:21:04 crc kubenswrapper[4636]: I1003 14:21:04.152067 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c","Type":"ContainerStarted","Data":"4015dd2c49a7bb8c6ca853fa2b3a29703dec5543f5eb40be4eb0d13315c31704"} Oct 03 14:21:04 crc kubenswrapper[4636]: I1003 14:21:04.513460 4636 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.006375488 podStartE2EDuration="2.51343625s" podCreationTimestamp="2025-10-03 14:21:02 +0000 UTC" firstStartedPulling="2025-10-03 14:21:03.134665183 +0000 UTC m=+1212.993391430" lastFinishedPulling="2025-10-03 14:21:03.641725945 +0000 UTC m=+1213.500452192" observedRunningTime="2025-10-03 14:21:04.176670401 +0000 UTC m=+1214.035396648" watchObservedRunningTime="2025-10-03 14:21:04.51343625 +0000 UTC m=+1214.372162497" Oct 03 14:21:04 crc kubenswrapper[4636]: I1003 14:21:04.514177 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:04 crc kubenswrapper[4636]: I1003 14:21:04.789897 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:21:04 crc kubenswrapper[4636]: I1003 14:21:04.791452 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8fbf639b-5c21-47d7-9596-091f6b796167" containerName="glance-httpd" containerID="cri-o://90137aba5d1d67c92f563994412f3500af0959d309d1ae4e0938c4819741745c" gracePeriod=30 Oct 03 14:21:04 crc kubenswrapper[4636]: I1003 14:21:04.791826 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8fbf639b-5c21-47d7-9596-091f6b796167" containerName="glance-log" containerID="cri-o://07220c4837c958438f7d722ed3c4b264b8bc20c858517d2c129d21ba2de945a8" gracePeriod=30 Oct 03 14:21:05 crc kubenswrapper[4636]: I1003 14:21:05.167478 4636 generic.go:334] "Generic (PLEG): container finished" podID="8fbf639b-5c21-47d7-9596-091f6b796167" containerID="07220c4837c958438f7d722ed3c4b264b8bc20c858517d2c129d21ba2de945a8" exitCode=143 Oct 03 14:21:05 crc kubenswrapper[4636]: I1003 14:21:05.168302 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8fbf639b-5c21-47d7-9596-091f6b796167","Type":"ContainerDied","Data":"07220c4837c958438f7d722ed3c4b264b8bc20c858517d2c129d21ba2de945a8"} Oct 03 14:21:05 crc kubenswrapper[4636]: I1003 14:21:05.168830 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 03 14:21:06 crc kubenswrapper[4636]: I1003 14:21:06.177841 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c","Type":"ContainerStarted","Data":"34df0b62b3ff0e9c443496335bb071af03d516fde599f660d6b27e8bc028fa21"} Oct 03 14:21:06 crc kubenswrapper[4636]: I1003 14:21:06.178287 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" containerName="ceilometer-central-agent" containerID="cri-o://c9e2b973779766890256b7bcc9c85345aa5aa687ccd4761c8bd0c11d74c17ec2" gracePeriod=30 Oct 03 14:21:06 crc kubenswrapper[4636]: I1003 14:21:06.178529 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" containerName="proxy-httpd" containerID="cri-o://34df0b62b3ff0e9c443496335bb071af03d516fde599f660d6b27e8bc028fa21" gracePeriod=30 Oct 03 14:21:06 crc kubenswrapper[4636]: I1003 14:21:06.178669 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" containerName="ceilometer-notification-agent" 
containerID="cri-o://4015dd2c49a7bb8c6ca853fa2b3a29703dec5543f5eb40be4eb0d13315c31704" gracePeriod=30 Oct 03 14:21:06 crc kubenswrapper[4636]: I1003 14:21:06.178715 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" containerName="sg-core" containerID="cri-o://220727fc6e6a0a8ae5e4aa31aabbafe261276d69081bd0a05da05a92f2a68e40" gracePeriod=30 Oct 03 14:21:06 crc kubenswrapper[4636]: I1003 14:21:06.207670 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.287552527 podStartE2EDuration="6.20764604s" podCreationTimestamp="2025-10-03 14:21:00 +0000 UTC" firstStartedPulling="2025-10-03 14:21:01.350546227 +0000 UTC m=+1211.209272474" lastFinishedPulling="2025-10-03 14:21:05.27063974 +0000 UTC m=+1215.129365987" observedRunningTime="2025-10-03 14:21:06.199600951 +0000 UTC m=+1216.058327198" watchObservedRunningTime="2025-10-03 14:21:06.20764604 +0000 UTC m=+1216.066372287" Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.196229 4636 generic.go:334] "Generic (PLEG): container finished" podID="9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" containerID="34df0b62b3ff0e9c443496335bb071af03d516fde599f660d6b27e8bc028fa21" exitCode=0 Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.197269 4636 generic.go:334] "Generic (PLEG): container finished" podID="9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" containerID="220727fc6e6a0a8ae5e4aa31aabbafe261276d69081bd0a05da05a92f2a68e40" exitCode=2 Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.197289 4636 generic.go:334] "Generic (PLEG): container finished" podID="9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" containerID="4015dd2c49a7bb8c6ca853fa2b3a29703dec5543f5eb40be4eb0d13315c31704" exitCode=0 Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.197340 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c","Type":"ContainerDied","Data":"34df0b62b3ff0e9c443496335bb071af03d516fde599f660d6b27e8bc028fa21"} Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.197384 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c","Type":"ContainerDied","Data":"220727fc6e6a0a8ae5e4aa31aabbafe261276d69081bd0a05da05a92f2a68e40"} Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.197397 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c","Type":"ContainerDied","Data":"4015dd2c49a7bb8c6ca853fa2b3a29703dec5543f5eb40be4eb0d13315c31704"} Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.210906 4636 generic.go:334] "Generic (PLEG): container finished" podID="fdd6b4ee-372a-42a0-a353-b3a82463d3ff" containerID="eeec22460d487729f2abc1d313a5bd005c59d08335b1724f0b02955f839bb9f1" exitCode=0 Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.210955 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fdd6b4ee-372a-42a0-a353-b3a82463d3ff","Type":"ContainerDied","Data":"eeec22460d487729f2abc1d313a5bd005c59d08335b1724f0b02955f839bb9f1"} Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.447078 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.543375 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-public-tls-certs\") pod \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.543550 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-logs\") pod \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.543570 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-combined-ca-bundle\") pod \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.543609 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-config-data\") pod \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.543628 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-scripts\") pod \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.543651 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs2hb\" (UniqueName: \"kubernetes.io/projected/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-kube-api-access-gs2hb\") pod \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.543691 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.543734 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-httpd-run\") pod \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\" (UID: \"fdd6b4ee-372a-42a0-a353-b3a82463d3ff\") " Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.544345 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fdd6b4ee-372a-42a0-a353-b3a82463d3ff" (UID: "fdd6b4ee-372a-42a0-a353-b3a82463d3ff"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.550537 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-logs" (OuterVolumeSpecName: "logs") pod "fdd6b4ee-372a-42a0-a353-b3a82463d3ff" (UID: "fdd6b4ee-372a-42a0-a353-b3a82463d3ff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.584630 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "fdd6b4ee-372a-42a0-a353-b3a82463d3ff" (UID: "fdd6b4ee-372a-42a0-a353-b3a82463d3ff"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.584734 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-scripts" (OuterVolumeSpecName: "scripts") pod "fdd6b4ee-372a-42a0-a353-b3a82463d3ff" (UID: "fdd6b4ee-372a-42a0-a353-b3a82463d3ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.584835 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-kube-api-access-gs2hb" (OuterVolumeSpecName: "kube-api-access-gs2hb") pod "fdd6b4ee-372a-42a0-a353-b3a82463d3ff" (UID: "fdd6b4ee-372a-42a0-a353-b3a82463d3ff"). InnerVolumeSpecName "kube-api-access-gs2hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.627810 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fdd6b4ee-372a-42a0-a353-b3a82463d3ff" (UID: "fdd6b4ee-372a-42a0-a353-b3a82463d3ff"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.645447 4636 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.645496 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.645509 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.645520 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs2hb\" (UniqueName: \"kubernetes.io/projected/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-kube-api-access-gs2hb\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.645548 4636 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.645579 4636 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.696510 4636 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.702784 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7976d47688-kx5v5" podUID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.734222 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fdd6b4ee-372a-42a0-a353-b3a82463d3ff" (UID: "fdd6b4ee-372a-42a0-a353-b3a82463d3ff"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.737479 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-config-data" (OuterVolumeSpecName: "config-data") pod "fdd6b4ee-372a-42a0-a353-b3a82463d3ff" (UID: "fdd6b4ee-372a-42a0-a353-b3a82463d3ff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.749227 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.749259 4636 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.749269 4636 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdd6b4ee-372a-42a0-a353-b3a82463d3ff-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:07 crc kubenswrapper[4636]: I1003 14:21:07.974912 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-8c5bc9456-rfvns" podUID="0025da7c-17f3-4036-a9fc-3330508c11cd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.124392 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="8fbf639b-5c21-47d7-9596-091f6b796167" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.151:9292/healthcheck\": dial tcp 10.217.0.151:9292: connect: connection refused" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.124489 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="8fbf639b-5c21-47d7-9596-091f6b796167" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.151:9292/healthcheck\": dial tcp 10.217.0.151:9292: connect: connection refused" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.224431 4636 generic.go:334] "Generic (PLEG): container finished" podID="8fbf639b-5c21-47d7-9596-091f6b796167" containerID="90137aba5d1d67c92f563994412f3500af0959d309d1ae4e0938c4819741745c" exitCode=0 Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.224477 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8fbf639b-5c21-47d7-9596-091f6b796167","Type":"ContainerDied","Data":"90137aba5d1d67c92f563994412f3500af0959d309d1ae4e0938c4819741745c"} Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.229160 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fdd6b4ee-372a-42a0-a353-b3a82463d3ff","Type":"ContainerDied","Data":"68d2c7efcfa8d8d9cdbf850cb801611634bb7eff15d0eba7ccfa1d44eb43db54"} Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.229207 4636 scope.go:117] "RemoveContainer" containerID="eeec22460d487729f2abc1d313a5bd005c59d08335b1724f0b02955f839bb9f1" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.229263 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.277407 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.279177 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.296079 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:21:08 crc kubenswrapper[4636]: E1003 14:21:08.296444 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd6b4ee-372a-42a0-a353-b3a82463d3ff" containerName="glance-httpd" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.296460 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd6b4ee-372a-42a0-a353-b3a82463d3ff" containerName="glance-httpd" Oct 03 14:21:08 crc kubenswrapper[4636]: E1003 14:21:08.296488 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd6b4ee-372a-42a0-a353-b3a82463d3ff" containerName="glance-log" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.296495 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd6b4ee-372a-42a0-a353-b3a82463d3ff" containerName="glance-log" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.296664 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdd6b4ee-372a-42a0-a353-b3a82463d3ff" containerName="glance-httpd" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.296688 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdd6b4ee-372a-42a0-a353-b3a82463d3ff" containerName="glance-log" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.297863 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.298728 4636 scope.go:117] "RemoveContainer" containerID="4c704ee2e692fc51c22595e16d7e80038cffaa667db593234f019af812029b0c" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.301424 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.301742 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.333945 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.467345 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48268aa0-45d6-42d4-a902-6f9221eae8d7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.467381 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5hqp\" (UniqueName: \"kubernetes.io/projected/48268aa0-45d6-42d4-a902-6f9221eae8d7-kube-api-access-w5hqp\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.467398 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48268aa0-45d6-42d4-a902-6f9221eae8d7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.467436 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48268aa0-45d6-42d4-a902-6f9221eae8d7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.467508 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48268aa0-45d6-42d4-a902-6f9221eae8d7-logs\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.467528 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.467574 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48268aa0-45d6-42d4-a902-6f9221eae8d7-config-data\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " 
pod="openstack/glance-default-external-api-0" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.467599 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48268aa0-45d6-42d4-a902-6f9221eae8d7-scripts\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.569093 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48268aa0-45d6-42d4-a902-6f9221eae8d7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.569172 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5hqp\" (UniqueName: \"kubernetes.io/projected/48268aa0-45d6-42d4-a902-6f9221eae8d7-kube-api-access-w5hqp\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.569814 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48268aa0-45d6-42d4-a902-6f9221eae8d7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.569864 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48268aa0-45d6-42d4-a902-6f9221eae8d7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.569908 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48268aa0-45d6-42d4-a902-6f9221eae8d7-logs\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.569927 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.569965 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48268aa0-45d6-42d4-a902-6f9221eae8d7-config-data\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.570463 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48268aa0-45d6-42d4-a902-6f9221eae8d7-scripts\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 
Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.570466 4636 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0"
Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.570526 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48268aa0-45d6-42d4-a902-6f9221eae8d7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.570949 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48268aa0-45d6-42d4-a902-6f9221eae8d7-logs\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.577768 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48268aa0-45d6-42d4-a902-6f9221eae8d7-scripts\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.578317 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48268aa0-45d6-42d4-a902-6f9221eae8d7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.580576 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48268aa0-45d6-42d4-a902-6f9221eae8d7-config-data\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.580695 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48268aa0-45d6-42d4-a902-6f9221eae8d7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.597968 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5hqp\" (UniqueName: \"kubernetes.io/projected/48268aa0-45d6-42d4-a902-6f9221eae8d7-kube-api-access-w5hqp\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.635040 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"48268aa0-45d6-42d4-a902-6f9221eae8d7\") " pod="openstack/glance-default-external-api-0"
Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.694780 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.775379 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-config-data\") pod \"8fbf639b-5c21-47d7-9596-091f6b796167\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") "
Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.775439 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-internal-tls-certs\") pod \"8fbf639b-5c21-47d7-9596-091f6b796167\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") "
Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.775485 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fbf639b-5c21-47d7-9596-091f6b796167-logs\") pod \"8fbf639b-5c21-47d7-9596-091f6b796167\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") "
Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.775519 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"8fbf639b-5c21-47d7-9596-091f6b796167\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") "
Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.775576 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8fbf639b-5c21-47d7-9596-091f6b796167-httpd-run\") pod \"8fbf639b-5c21-47d7-9596-091f6b796167\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") "
Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.775602 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-combined-ca-bundle\") pod \"8fbf639b-5c21-47d7-9596-091f6b796167\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") "
Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.775734 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-scripts\") pod \"8fbf639b-5c21-47d7-9596-091f6b796167\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") "
Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.775799 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llsnv\" (UniqueName: \"kubernetes.io/projected/8fbf639b-5c21-47d7-9596-091f6b796167-kube-api-access-llsnv\") pod \"8fbf639b-5c21-47d7-9596-091f6b796167\" (UID: \"8fbf639b-5c21-47d7-9596-091f6b796167\") "
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.811726 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fbf639b-5c21-47d7-9596-091f6b796167-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8fbf639b-5c21-47d7-9596-091f6b796167" (UID: "8fbf639b-5c21-47d7-9596-091f6b796167"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.811818 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fbf639b-5c21-47d7-9596-091f6b796167-kube-api-access-llsnv" (OuterVolumeSpecName: "kube-api-access-llsnv") pod "8fbf639b-5c21-47d7-9596-091f6b796167" (UID: "8fbf639b-5c21-47d7-9596-091f6b796167"). InnerVolumeSpecName "kube-api-access-llsnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.825356 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "8fbf639b-5c21-47d7-9596-091f6b796167" (UID: "8fbf639b-5c21-47d7-9596-091f6b796167"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.825525 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdd6b4ee-372a-42a0-a353-b3a82463d3ff" path="/var/lib/kubelet/pods/fdd6b4ee-372a-42a0-a353-b3a82463d3ff/volumes" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.837524 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-scripts" (OuterVolumeSpecName: "scripts") pod "8fbf639b-5c21-47d7-9596-091f6b796167" (UID: "8fbf639b-5c21-47d7-9596-091f6b796167"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.872788 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8fbf639b-5c21-47d7-9596-091f6b796167" (UID: "8fbf639b-5c21-47d7-9596-091f6b796167"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.881669 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.881699 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llsnv\" (UniqueName: \"kubernetes.io/projected/8fbf639b-5c21-47d7-9596-091f6b796167-kube-api-access-llsnv\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.881711 4636 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.881719 4636 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fbf639b-5c21-47d7-9596-091f6b796167-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.881737 4636 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.881746 4636 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8fbf639b-5c21-47d7-9596-091f6b796167-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.915751 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fbf639b-5c21-47d7-9596-091f6b796167" (UID: "8fbf639b-5c21-47d7-9596-091f6b796167"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.928176 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.928226 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-config-data" (OuterVolumeSpecName: "config-data") pod "8fbf639b-5c21-47d7-9596-091f6b796167" (UID: "8fbf639b-5c21-47d7-9596-091f6b796167"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.955818 4636 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.987192 4636 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.987438 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:08 crc kubenswrapper[4636]: I1003 14:21:08.987507 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbf639b-5c21-47d7-9596-091f6b796167-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.247525 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8fbf639b-5c21-47d7-9596-091f6b796167","Type":"ContainerDied","Data":"149adb4504287664c60ae6f29eb7c4cf06b834e2f0815c9e5d62066d3b9ae392"} Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.247822 4636 scope.go:117] "RemoveContainer" containerID="90137aba5d1d67c92f563994412f3500af0959d309d1ae4e0938c4819741745c" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.247966 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.297458 4636 scope.go:117] "RemoveContainer" containerID="07220c4837c958438f7d722ed3c4b264b8bc20c858517d2c129d21ba2de945a8" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.310875 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.339799 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.356900 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:21:09 crc kubenswrapper[4636]: E1003 14:21:09.357376 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbf639b-5c21-47d7-9596-091f6b796167" containerName="glance-log" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.357392 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbf639b-5c21-47d7-9596-091f6b796167" containerName="glance-log" Oct 03 14:21:09 crc kubenswrapper[4636]: E1003 14:21:09.357416 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbf639b-5c21-47d7-9596-091f6b796167" containerName="glance-httpd" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.357423 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbf639b-5c21-47d7-9596-091f6b796167" containerName="glance-httpd" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.357590 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbf639b-5c21-47d7-9596-091f6b796167" containerName="glance-log" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.357606 4636 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8fbf639b-5c21-47d7-9596-091f6b796167" containerName="glance-httpd" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.358512 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.360656 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.364274 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.364435 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.505564 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hqfg\" (UniqueName: \"kubernetes.io/projected/8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4-kube-api-access-8hqfg\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.505615 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.505667 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4-logs\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.505693 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.505710 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.505740 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.505764 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " 
pod="openstack/glance-default-internal-api-0" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.505780 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.606927 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.606998 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.607119 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hqfg\" (UniqueName: \"kubernetes.io/projected/8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4-kube-api-access-8hqfg\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.607155 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.607223 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4-logs\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.607254 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.607277 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.607315 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " pod="openstack/glance-default-internal-api-0" Oct 03 14:21:09 crc 
Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.608793 4636 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0"
Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.608941 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4-logs\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.609055 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.619952 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.627384 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.630611 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.642297 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.666442 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hqfg\" (UniqueName: \"kubernetes.io/projected/8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4-kube-api-access-8hqfg\") pod \"glance-default-internal-api-0\" (UID: \"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4\") " pod="openstack/glance-default-internal-api-0"
Oct 03 14:21:09 crc kubenswrapper[4636]: I1003 14:21:09.695256 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
pod="openstack/glance-default-internal-api-0" Oct 03 14:21:10 crc kubenswrapper[4636]: I1003 14:21:10.024988 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 14:21:10 crc kubenswrapper[4636]: I1003 14:21:10.280958 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48268aa0-45d6-42d4-a902-6f9221eae8d7","Type":"ContainerStarted","Data":"749e026a69934087f7d1576df096748df19a6f795d2e6a3e9784b9fde4824221"} Oct 03 14:21:10 crc kubenswrapper[4636]: I1003 14:21:10.719843 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 14:21:10 crc kubenswrapper[4636]: W1003 14:21:10.728117 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f9d57db_69a6_4123_bcd5_a1b83b1c9cc4.slice/crio-4b1845e89ed7209f3270e8363fb2c47a4cc72859ccdd439437336f48c545bcb6 WatchSource:0}: Error finding container 4b1845e89ed7209f3270e8363fb2c47a4cc72859ccdd439437336f48c545bcb6: Status 404 returned error can't find the container with id 4b1845e89ed7209f3270e8363fb2c47a4cc72859ccdd439437336f48c545bcb6 Oct 03 14:21:10 crc kubenswrapper[4636]: I1003 14:21:10.817695 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fbf639b-5c21-47d7-9596-091f6b796167" path="/var/lib/kubelet/pods/8fbf639b-5c21-47d7-9596-091f6b796167/volumes" Oct 03 14:21:11 crc kubenswrapper[4636]: I1003 14:21:11.302014 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48268aa0-45d6-42d4-a902-6f9221eae8d7","Type":"ContainerStarted","Data":"ff2fe18b3531758d6687126dc512b40361973b96916f4b04e8137d94f89e6c4e"} Oct 03 14:21:11 crc kubenswrapper[4636]: I1003 14:21:11.305801 4636 generic.go:334] "Generic (PLEG): container finished" podID="9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" containerID="c9e2b973779766890256b7bcc9c85345aa5aa687ccd4761c8bd0c11d74c17ec2" exitCode=0 Oct 03 14:21:11 crc kubenswrapper[4636]: I1003 14:21:11.305866 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c","Type":"ContainerDied","Data":"c9e2b973779766890256b7bcc9c85345aa5aa687ccd4761c8bd0c11d74c17ec2"} Oct 03 14:21:11 crc kubenswrapper[4636]: I1003 14:21:11.307269 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4","Type":"ContainerStarted","Data":"4b1845e89ed7209f3270e8363fb2c47a4cc72859ccdd439437336f48c545bcb6"} Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.338310 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48268aa0-45d6-42d4-a902-6f9221eae8d7","Type":"ContainerStarted","Data":"4a09327b40e28205e1bb24d207411b4c05d507928188d9b7d8fcd36bfa360ee5"} Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.341053 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4","Type":"ContainerStarted","Data":"7b06af0f42b32fdbde88294f521f7c53915fdf7c3dfb69ae97a353eed47feec4"} Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.375169 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.375146026 
podStartE2EDuration="4.375146026s" podCreationTimestamp="2025-10-03 14:21:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:21:12.371381518 +0000 UTC m=+1222.230107775" watchObservedRunningTime="2025-10-03 14:21:12.375146026 +0000 UTC m=+1222.233872273" Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.569472 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.571260 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.683855 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-run-httpd\") pod \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.684331 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-combined-ca-bundle\") pod \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.684391 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-log-httpd\") pod \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.684442 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-config-data\") pod \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.684513 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-sg-core-conf-yaml\") pod \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.684607 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-scripts\") pod \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.684639 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj2m8\" (UniqueName: \"kubernetes.io/projected/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-kube-api-access-pj2m8\") pod \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\" (UID: \"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c\") " Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.687195 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" (UID: "9d7c2f77-1143-4f1c-b9fe-75cb1ded287c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.691320 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" (UID: "9d7c2f77-1143-4f1c-b9fe-75cb1ded287c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.722426 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-scripts" (OuterVolumeSpecName: "scripts") pod "9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" (UID: "9d7c2f77-1143-4f1c-b9fe-75cb1ded287c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.722575 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-kube-api-access-pj2m8" (OuterVolumeSpecName: "kube-api-access-pj2m8") pod "9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" (UID: "9d7c2f77-1143-4f1c-b9fe-75cb1ded287c"). InnerVolumeSpecName "kube-api-access-pj2m8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.792145 4636 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.792323 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.792410 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj2m8\" (UniqueName: \"kubernetes.io/projected/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-kube-api-access-pj2m8\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.792495 4636 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.810321 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" (UID: "9d7c2f77-1143-4f1c-b9fe-75cb1ded287c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.877939 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" (UID: "9d7c2f77-1143-4f1c-b9fe-75cb1ded287c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.894812 4636 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.895033 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.967309 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-config-data" (OuterVolumeSpecName: "config-data") pod "9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" (UID: "9d7c2f77-1143-4f1c-b9fe-75cb1ded287c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:12 crc kubenswrapper[4636]: I1003 14:21:12.996492 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.350685 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4","Type":"ContainerStarted","Data":"e4e1b15313c963d358d21512c555e1f38224fa708cf3dcccbeef06445bd53952"} Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.354568 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.364521 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d7c2f77-1143-4f1c-b9fe-75cb1ded287c","Type":"ContainerDied","Data":"f8aee5558ee5bbcdb9e0fac223549fe9225b29f47b8d2d9ca7e1f45e8384fe0e"} Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.364564 4636 scope.go:117] "RemoveContainer" containerID="34df0b62b3ff0e9c443496335bb071af03d516fde599f660d6b27e8bc028fa21" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.389069 4636 scope.go:117] "RemoveContainer" containerID="220727fc6e6a0a8ae5e4aa31aabbafe261276d69081bd0a05da05a92f2a68e40" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.389087 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.389071284 podStartE2EDuration="4.389071284s" podCreationTimestamp="2025-10-03 14:21:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:21:13.387428811 +0000 UTC m=+1223.246155058" watchObservedRunningTime="2025-10-03 14:21:13.389071284 +0000 UTC m=+1223.247797531" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.414912 4636 scope.go:117] "RemoveContainer" containerID="4015dd2c49a7bb8c6ca853fa2b3a29703dec5543f5eb40be4eb0d13315c31704" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.427957 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.434998 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.464152 4636 scope.go:117] 
"RemoveContainer" containerID="c9e2b973779766890256b7bcc9c85345aa5aa687ccd4761c8bd0c11d74c17ec2" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.466578 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:13 crc kubenswrapper[4636]: E1003 14:21:13.466934 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" containerName="ceilometer-central-agent" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.466950 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" containerName="ceilometer-central-agent" Oct 03 14:21:13 crc kubenswrapper[4636]: E1003 14:21:13.466994 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" containerName="sg-core" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.467002 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" containerName="sg-core" Oct 03 14:21:13 crc kubenswrapper[4636]: E1003 14:21:13.467014 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" containerName="ceilometer-notification-agent" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.467023 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" containerName="ceilometer-notification-agent" Oct 03 14:21:13 crc kubenswrapper[4636]: E1003 14:21:13.467033 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" containerName="proxy-httpd" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.467039 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" containerName="proxy-httpd" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.467228 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" containerName="sg-core" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.467247 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" containerName="ceilometer-notification-agent" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.467265 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" containerName="proxy-httpd" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.467279 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" containerName="ceilometer-central-agent" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.468906 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.487372 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.488188 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.488365 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.499376 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.609566 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.609648 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-config-data\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.609702 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6522439-da96-4e38-bf00-55a0fe9440e5-run-httpd\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.609770 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7f7s\" (UniqueName: \"kubernetes.io/projected/c6522439-da96-4e38-bf00-55a0fe9440e5-kube-api-access-r7f7s\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.609804 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6522439-da96-4e38-bf00-55a0fe9440e5-log-httpd\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.609833 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-scripts\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.609862 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.609886 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.710950 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6522439-da96-4e38-bf00-55a0fe9440e5-run-httpd\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.711038 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7f7s\" (UniqueName: \"kubernetes.io/projected/c6522439-da96-4e38-bf00-55a0fe9440e5-kube-api-access-r7f7s\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.711062 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6522439-da96-4e38-bf00-55a0fe9440e5-log-httpd\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.711084 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-scripts\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.711125 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.711165 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.711251 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.711296 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-config-data\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.712201 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6522439-da96-4e38-bf00-55a0fe9440e5-log-httpd\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.712406 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/c6522439-da96-4e38-bf00-55a0fe9440e5-run-httpd\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.717396 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-scripts\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.719648 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-config-data\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.719730 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.720196 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.726038 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.728815 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7f7s\" (UniqueName: \"kubernetes.io/projected/c6522439-da96-4e38-bf00-55a0fe9440e5-kube-api-access-r7f7s\") pod \"ceilometer-0\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " pod="openstack/ceilometer-0" Oct 03 14:21:13 crc kubenswrapper[4636]: I1003 14:21:13.852923 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:21:14 crc kubenswrapper[4636]: I1003 14:21:14.420328 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:14 crc kubenswrapper[4636]: I1003 14:21:14.803258 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d7c2f77-1143-4f1c-b9fe-75cb1ded287c" path="/var/lib/kubelet/pods/9d7c2f77-1143-4f1c-b9fe-75cb1ded287c/volumes" Oct 03 14:21:15 crc kubenswrapper[4636]: I1003 14:21:15.255040 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:15 crc kubenswrapper[4636]: I1003 14:21:15.380925 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6522439-da96-4e38-bf00-55a0fe9440e5","Type":"ContainerStarted","Data":"9227ef7e4c6e33e2458127568b9b034881cc76de197f93a7916ff8ecc5392bfb"} Oct 03 14:21:15 crc kubenswrapper[4636]: I1003 14:21:15.380975 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6522439-da96-4e38-bf00-55a0fe9440e5","Type":"ContainerStarted","Data":"e02e679efcf1e0214b2e5ecf31aed1de7e5d50cb2f22e5a3dfc0a6632984388d"} Oct 03 14:21:16 crc kubenswrapper[4636]: I1003 14:21:16.390134 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6522439-da96-4e38-bf00-55a0fe9440e5","Type":"ContainerStarted","Data":"34e39474bac0de2fc8fe1458878ee903fc23478edd3422d7d7891660333908dd"} Oct 03 14:21:17 crc kubenswrapper[4636]: I1003 14:21:17.694129 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7976d47688-kx5v5" podUID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Oct 03 14:21:17 crc kubenswrapper[4636]: I1003 14:21:17.975869 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-8c5bc9456-rfvns" podUID="0025da7c-17f3-4036-a9fc-3330508c11cd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Oct 03 14:21:18 crc kubenswrapper[4636]: I1003 14:21:18.414016 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6522439-da96-4e38-bf00-55a0fe9440e5","Type":"ContainerStarted","Data":"6443c490dbc0d6baf480e964ad3e49492055811128ff8927a107f50db112f742"} Oct 03 14:21:18 crc kubenswrapper[4636]: I1003 14:21:18.929519 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 14:21:18 crc kubenswrapper[4636]: I1003 14:21:18.929841 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 14:21:18 crc kubenswrapper[4636]: I1003 14:21:18.970538 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 14:21:19 crc kubenswrapper[4636]: I1003 14:21:19.007952 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 14:21:19 crc kubenswrapper[4636]: I1003 14:21:19.431376 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c6522439-da96-4e38-bf00-55a0fe9440e5","Type":"ContainerStarted","Data":"c294a4bd6fdd37fa9663cd4e4f535379cc154a64b4cc9956b1e2e0a63223ed39"} Oct 03 14:21:19 crc kubenswrapper[4636]: I1003 14:21:19.431803 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 14:21:19 crc kubenswrapper[4636]: I1003 14:21:19.431852 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 14:21:19 crc kubenswrapper[4636]: I1003 14:21:19.432185 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c6522439-da96-4e38-bf00-55a0fe9440e5" containerName="ceilometer-central-agent" containerID="cri-o://9227ef7e4c6e33e2458127568b9b034881cc76de197f93a7916ff8ecc5392bfb" gracePeriod=30 Oct 03 14:21:19 crc kubenswrapper[4636]: I1003 14:21:19.432481 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c6522439-da96-4e38-bf00-55a0fe9440e5" containerName="proxy-httpd" containerID="cri-o://c294a4bd6fdd37fa9663cd4e4f535379cc154a64b4cc9956b1e2e0a63223ed39" gracePeriod=30 Oct 03 14:21:19 crc kubenswrapper[4636]: I1003 14:21:19.432527 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c6522439-da96-4e38-bf00-55a0fe9440e5" containerName="ceilometer-notification-agent" containerID="cri-o://34e39474bac0de2fc8fe1458878ee903fc23478edd3422d7d7891660333908dd" gracePeriod=30 Oct 03 14:21:19 crc kubenswrapper[4636]: I1003 14:21:19.432598 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c6522439-da96-4e38-bf00-55a0fe9440e5" containerName="sg-core" containerID="cri-o://6443c490dbc0d6baf480e964ad3e49492055811128ff8927a107f50db112f742" gracePeriod=30 Oct 03 14:21:19 crc kubenswrapper[4636]: I1003 14:21:19.467786 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.168299574 podStartE2EDuration="6.467766102s" podCreationTimestamp="2025-10-03 14:21:13 +0000 UTC" firstStartedPulling="2025-10-03 14:21:14.429966654 +0000 UTC m=+1224.288692901" lastFinishedPulling="2025-10-03 14:21:18.729433182 +0000 UTC m=+1228.588159429" observedRunningTime="2025-10-03 14:21:19.456045768 +0000 UTC m=+1229.314772025" watchObservedRunningTime="2025-10-03 14:21:19.467766102 +0000 UTC m=+1229.326492349" Oct 03 14:21:20 crc kubenswrapper[4636]: I1003 14:21:20.025483 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 14:21:20 crc kubenswrapper[4636]: I1003 14:21:20.025779 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 14:21:20 crc kubenswrapper[4636]: I1003 14:21:20.080572 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 14:21:20 crc kubenswrapper[4636]: I1003 14:21:20.109576 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 14:21:20 crc kubenswrapper[4636]: I1003 14:21:20.444170 4636 generic.go:334] "Generic (PLEG): container finished" podID="c6522439-da96-4e38-bf00-55a0fe9440e5" containerID="c294a4bd6fdd37fa9663cd4e4f535379cc154a64b4cc9956b1e2e0a63223ed39" exitCode=0 Oct 03 14:21:20 crc kubenswrapper[4636]: 
Oct 03 14:21:20 crc kubenswrapper[4636]: I1003 14:21:20.025483 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 03 14:21:20 crc kubenswrapper[4636]: I1003 14:21:20.025779 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 03 14:21:20 crc kubenswrapper[4636]: I1003 14:21:20.080572 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 03 14:21:20 crc kubenswrapper[4636]: I1003 14:21:20.109576 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 03 14:21:20 crc kubenswrapper[4636]: I1003 14:21:20.444170 4636 generic.go:334] "Generic (PLEG): container finished" podID="c6522439-da96-4e38-bf00-55a0fe9440e5" containerID="c294a4bd6fdd37fa9663cd4e4f535379cc154a64b4cc9956b1e2e0a63223ed39" exitCode=0
Oct 03 14:21:20 crc kubenswrapper[4636]: I1003 14:21:20.444213 4636 generic.go:334] "Generic (PLEG): container finished" podID="c6522439-da96-4e38-bf00-55a0fe9440e5" containerID="6443c490dbc0d6baf480e964ad3e49492055811128ff8927a107f50db112f742" exitCode=2
Oct 03 14:21:20 crc kubenswrapper[4636]: I1003 14:21:20.444226 4636 generic.go:334] "Generic (PLEG): container finished" podID="c6522439-da96-4e38-bf00-55a0fe9440e5" containerID="34e39474bac0de2fc8fe1458878ee903fc23478edd3422d7d7891660333908dd" exitCode=0
Oct 03 14:21:20 crc kubenswrapper[4636]: I1003 14:21:20.444287 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6522439-da96-4e38-bf00-55a0fe9440e5","Type":"ContainerDied","Data":"c294a4bd6fdd37fa9663cd4e4f535379cc154a64b4cc9956b1e2e0a63223ed39"}
Oct 03 14:21:20 crc kubenswrapper[4636]: I1003 14:21:20.444351 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6522439-da96-4e38-bf00-55a0fe9440e5","Type":"ContainerDied","Data":"6443c490dbc0d6baf480e964ad3e49492055811128ff8927a107f50db112f742"}
Oct 03 14:21:20 crc kubenswrapper[4636]: I1003 14:21:20.444383 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6522439-da96-4e38-bf00-55a0fe9440e5","Type":"ContainerDied","Data":"34e39474bac0de2fc8fe1458878ee903fc23478edd3422d7d7891660333908dd"}
Oct 03 14:21:20 crc kubenswrapper[4636]: I1003 14:21:20.444946 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Oct 03 14:21:20 crc kubenswrapper[4636]: I1003 14:21:20.444982 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Oct 03 14:21:22 crc kubenswrapper[4636]: I1003 14:21:22.055760 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 03 14:21:22 crc kubenswrapper[4636]: I1003 14:21:22.056226 4636 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 03 14:21:22 crc kubenswrapper[4636]: I1003 14:21:22.141727 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 03 14:21:22 crc kubenswrapper[4636]: I1003 14:21:22.694744 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 03 14:21:22 crc kubenswrapper[4636]: I1003 14:21:22.695058 4636 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 03 14:21:22 crc kubenswrapper[4636]: I1003 14:21:22.703369 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 03 14:21:24 crc kubenswrapper[4636]: I1003 14:21:24.604182 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-rbdk4"]
Oct 03 14:21:24 crc kubenswrapper[4636]: I1003 14:21:24.606305 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rbdk4"
Oct 03 14:21:24 crc kubenswrapper[4636]: I1003 14:21:24.651426 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rbdk4"]
Oct 03 14:21:24 crc kubenswrapper[4636]: I1003 14:21:24.667702 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-sfqrs"]
Need to start a new one" pod="openstack/nova-cell0-db-create-sfqrs" Oct 03 14:21:24 crc kubenswrapper[4636]: I1003 14:21:24.674052 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-sfqrs"] Oct 03 14:21:24 crc kubenswrapper[4636]: I1003 14:21:24.727271 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xd85\" (UniqueName: \"kubernetes.io/projected/f762c138-feda-4a6d-8d07-dfcbb5efaf4d-kube-api-access-8xd85\") pod \"nova-api-db-create-rbdk4\" (UID: \"f762c138-feda-4a6d-8d07-dfcbb5efaf4d\") " pod="openstack/nova-api-db-create-rbdk4" Oct 03 14:21:24 crc kubenswrapper[4636]: I1003 14:21:24.727428 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwm47\" (UniqueName: \"kubernetes.io/projected/79e8d634-a8e9-43d5-ac01-e640cc209af3-kube-api-access-dwm47\") pod \"nova-cell0-db-create-sfqrs\" (UID: \"79e8d634-a8e9-43d5-ac01-e640cc209af3\") " pod="openstack/nova-cell0-db-create-sfqrs" Oct 03 14:21:24 crc kubenswrapper[4636]: I1003 14:21:24.753276 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-fj8lz"] Oct 03 14:21:24 crc kubenswrapper[4636]: I1003 14:21:24.754512 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-fj8lz" Oct 03 14:21:24 crc kubenswrapper[4636]: I1003 14:21:24.782035 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-fj8lz"] Oct 03 14:21:24 crc kubenswrapper[4636]: I1003 14:21:24.830711 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnrs4\" (UniqueName: \"kubernetes.io/projected/7f746edf-871e-487f-96f0-d640ee2e9266-kube-api-access-bnrs4\") pod \"nova-cell1-db-create-fj8lz\" (UID: \"7f746edf-871e-487f-96f0-d640ee2e9266\") " pod="openstack/nova-cell1-db-create-fj8lz" Oct 03 14:21:24 crc kubenswrapper[4636]: I1003 14:21:24.830798 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xd85\" (UniqueName: \"kubernetes.io/projected/f762c138-feda-4a6d-8d07-dfcbb5efaf4d-kube-api-access-8xd85\") pod \"nova-api-db-create-rbdk4\" (UID: \"f762c138-feda-4a6d-8d07-dfcbb5efaf4d\") " pod="openstack/nova-api-db-create-rbdk4" Oct 03 14:21:24 crc kubenswrapper[4636]: I1003 14:21:24.830884 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwm47\" (UniqueName: \"kubernetes.io/projected/79e8d634-a8e9-43d5-ac01-e640cc209af3-kube-api-access-dwm47\") pod \"nova-cell0-db-create-sfqrs\" (UID: \"79e8d634-a8e9-43d5-ac01-e640cc209af3\") " pod="openstack/nova-cell0-db-create-sfqrs" Oct 03 14:21:24 crc kubenswrapper[4636]: I1003 14:21:24.849651 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xd85\" (UniqueName: \"kubernetes.io/projected/f762c138-feda-4a6d-8d07-dfcbb5efaf4d-kube-api-access-8xd85\") pod \"nova-api-db-create-rbdk4\" (UID: \"f762c138-feda-4a6d-8d07-dfcbb5efaf4d\") " pod="openstack/nova-api-db-create-rbdk4" Oct 03 14:21:24 crc kubenswrapper[4636]: I1003 14:21:24.850486 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwm47\" (UniqueName: \"kubernetes.io/projected/79e8d634-a8e9-43d5-ac01-e640cc209af3-kube-api-access-dwm47\") pod \"nova-cell0-db-create-sfqrs\" (UID: \"79e8d634-a8e9-43d5-ac01-e640cc209af3\") " pod="openstack/nova-cell0-db-create-sfqrs" 
Oct 03 14:21:24 crc kubenswrapper[4636]: I1003 14:21:24.932464 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnrs4\" (UniqueName: \"kubernetes.io/projected/7f746edf-871e-487f-96f0-d640ee2e9266-kube-api-access-bnrs4\") pod \"nova-cell1-db-create-fj8lz\" (UID: \"7f746edf-871e-487f-96f0-d640ee2e9266\") " pod="openstack/nova-cell1-db-create-fj8lz"
Oct 03 14:21:24 crc kubenswrapper[4636]: I1003 14:21:24.948504 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rbdk4"
Oct 03 14:21:24 crc kubenswrapper[4636]: I1003 14:21:24.952474 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnrs4\" (UniqueName: \"kubernetes.io/projected/7f746edf-871e-487f-96f0-d640ee2e9266-kube-api-access-bnrs4\") pod \"nova-cell1-db-create-fj8lz\" (UID: \"7f746edf-871e-487f-96f0-d640ee2e9266\") " pod="openstack/nova-cell1-db-create-fj8lz"
Oct 03 14:21:24 crc kubenswrapper[4636]: I1003 14:21:24.995217 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-sfqrs"
Oct 03 14:21:25 crc kubenswrapper[4636]: I1003 14:21:25.108060 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-fj8lz"
Oct 03 14:21:25 crc kubenswrapper[4636]: I1003 14:21:25.413735 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rbdk4"]
Oct 03 14:21:25 crc kubenswrapper[4636]: W1003 14:21:25.432956 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf762c138_feda_4a6d_8d07_dfcbb5efaf4d.slice/crio-fb99d7b99456baf4222c6d25af781559652ce9ac8c4d17a3458ba45b03eb4a07 WatchSource:0}: Error finding container fb99d7b99456baf4222c6d25af781559652ce9ac8c4d17a3458ba45b03eb4a07: Status 404 returned error can't find the container with id fb99d7b99456baf4222c6d25af781559652ce9ac8c4d17a3458ba45b03eb4a07
Oct 03 14:21:25 crc kubenswrapper[4636]: I1003 14:21:25.489753 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-fj8lz"]
Oct 03 14:21:25 crc kubenswrapper[4636]: W1003 14:21:25.492381 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f746edf_871e_487f_96f0_d640ee2e9266.slice/crio-0bc1a38073d1a37ed0e9e3745e50eb47b86241258cf4201c4ae538a322a596f0 WatchSource:0}: Error finding container 0bc1a38073d1a37ed0e9e3745e50eb47b86241258cf4201c4ae538a322a596f0: Status 404 returned error can't find the container with id 0bc1a38073d1a37ed0e9e3745e50eb47b86241258cf4201c4ae538a322a596f0
Oct 03 14:21:25 crc kubenswrapper[4636]: I1003 14:21:25.496349 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rbdk4" event={"ID":"f762c138-feda-4a6d-8d07-dfcbb5efaf4d","Type":"ContainerStarted","Data":"fb99d7b99456baf4222c6d25af781559652ce9ac8c4d17a3458ba45b03eb4a07"}
Oct 03 14:21:25 crc kubenswrapper[4636]: I1003 14:21:25.582970 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-sfqrs"]
Oct 03 14:21:25 crc kubenswrapper[4636]: W1003 14:21:25.593134 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79e8d634_a8e9_43d5_ac01_e640cc209af3.slice/crio-bdbef81c06207b04ea590384e9c9912dcb8789a2b58d93a41d103d0de711728b WatchSource:0}: Error finding container bdbef81c06207b04ea590384e9c9912dcb8789a2b58d93a41d103d0de711728b: Status 404 returned error can't find the container with id bdbef81c06207b04ea590384e9c9912dcb8789a2b58d93a41d103d0de711728b
Oct 03 14:21:26 crc kubenswrapper[4636]: I1003 14:21:26.506220 4636 generic.go:334] "Generic (PLEG): container finished" podID="f762c138-feda-4a6d-8d07-dfcbb5efaf4d" containerID="c69218f29980ade8c4fee05ca8263c09b390f7eb796d9667a66e6a558c52d546" exitCode=0
Oct 03 14:21:26 crc kubenswrapper[4636]: I1003 14:21:26.506541 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rbdk4" event={"ID":"f762c138-feda-4a6d-8d07-dfcbb5efaf4d","Type":"ContainerDied","Data":"c69218f29980ade8c4fee05ca8263c09b390f7eb796d9667a66e6a558c52d546"}
Oct 03 14:21:26 crc kubenswrapper[4636]: I1003 14:21:26.508962 4636 generic.go:334] "Generic (PLEG): container finished" podID="79e8d634-a8e9-43d5-ac01-e640cc209af3" containerID="b3d48089cf8ab9fa3275569fa4cf6f9bffbac37c67b2e708d5b5cf276d0eb1f5" exitCode=0
Oct 03 14:21:26 crc kubenswrapper[4636]: I1003 14:21:26.509034 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-sfqrs" event={"ID":"79e8d634-a8e9-43d5-ac01-e640cc209af3","Type":"ContainerDied","Data":"b3d48089cf8ab9fa3275569fa4cf6f9bffbac37c67b2e708d5b5cf276d0eb1f5"}
Oct 03 14:21:26 crc kubenswrapper[4636]: I1003 14:21:26.509055 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-sfqrs" event={"ID":"79e8d634-a8e9-43d5-ac01-e640cc209af3","Type":"ContainerStarted","Data":"bdbef81c06207b04ea590384e9c9912dcb8789a2b58d93a41d103d0de711728b"}
Oct 03 14:21:26 crc kubenswrapper[4636]: I1003 14:21:26.511858 4636 generic.go:334] "Generic (PLEG): container finished" podID="7f746edf-871e-487f-96f0-d640ee2e9266" containerID="f5b92446f305154875fc571b2ce39d3bb9cd0c0963f22bc9fed2395e1c63753d" exitCode=0
Oct 03 14:21:26 crc kubenswrapper[4636]: I1003 14:21:26.511906 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fj8lz" event={"ID":"7f746edf-871e-487f-96f0-d640ee2e9266","Type":"ContainerDied","Data":"f5b92446f305154875fc571b2ce39d3bb9cd0c0963f22bc9fed2395e1c63753d"}
Oct 03 14:21:26 crc kubenswrapper[4636]: I1003 14:21:26.511934 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fj8lz" event={"ID":"7f746edf-871e-487f-96f0-d640ee2e9266","Type":"ContainerStarted","Data":"0bc1a38073d1a37ed0e9e3745e50eb47b86241258cf4201c4ae538a322a596f0"}
Oct 03 14:21:28 crc kubenswrapper[4636]: I1003 14:21:28.052519 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rbdk4"
Oct 03 14:21:28 crc kubenswrapper[4636]: I1003 14:21:28.058996 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-fj8lz"
Need to start a new one" pod="openstack/nova-cell0-db-create-sfqrs" Oct 03 14:21:28 crc kubenswrapper[4636]: I1003 14:21:28.203835 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwm47\" (UniqueName: \"kubernetes.io/projected/79e8d634-a8e9-43d5-ac01-e640cc209af3-kube-api-access-dwm47\") pod \"79e8d634-a8e9-43d5-ac01-e640cc209af3\" (UID: \"79e8d634-a8e9-43d5-ac01-e640cc209af3\") " Oct 03 14:21:28 crc kubenswrapper[4636]: I1003 14:21:28.204240 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xd85\" (UniqueName: \"kubernetes.io/projected/f762c138-feda-4a6d-8d07-dfcbb5efaf4d-kube-api-access-8xd85\") pod \"f762c138-feda-4a6d-8d07-dfcbb5efaf4d\" (UID: \"f762c138-feda-4a6d-8d07-dfcbb5efaf4d\") " Oct 03 14:21:28 crc kubenswrapper[4636]: I1003 14:21:28.204311 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnrs4\" (UniqueName: \"kubernetes.io/projected/7f746edf-871e-487f-96f0-d640ee2e9266-kube-api-access-bnrs4\") pod \"7f746edf-871e-487f-96f0-d640ee2e9266\" (UID: \"7f746edf-871e-487f-96f0-d640ee2e9266\") " Oct 03 14:21:28 crc kubenswrapper[4636]: I1003 14:21:28.209630 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f762c138-feda-4a6d-8d07-dfcbb5efaf4d-kube-api-access-8xd85" (OuterVolumeSpecName: "kube-api-access-8xd85") pod "f762c138-feda-4a6d-8d07-dfcbb5efaf4d" (UID: "f762c138-feda-4a6d-8d07-dfcbb5efaf4d"). InnerVolumeSpecName "kube-api-access-8xd85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:21:28 crc kubenswrapper[4636]: I1003 14:21:28.210293 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f746edf-871e-487f-96f0-d640ee2e9266-kube-api-access-bnrs4" (OuterVolumeSpecName: "kube-api-access-bnrs4") pod "7f746edf-871e-487f-96f0-d640ee2e9266" (UID: "7f746edf-871e-487f-96f0-d640ee2e9266"). InnerVolumeSpecName "kube-api-access-bnrs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:21:28 crc kubenswrapper[4636]: I1003 14:21:28.212448 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79e8d634-a8e9-43d5-ac01-e640cc209af3-kube-api-access-dwm47" (OuterVolumeSpecName: "kube-api-access-dwm47") pod "79e8d634-a8e9-43d5-ac01-e640cc209af3" (UID: "79e8d634-a8e9-43d5-ac01-e640cc209af3"). InnerVolumeSpecName "kube-api-access-dwm47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:21:28 crc kubenswrapper[4636]: I1003 14:21:28.306723 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwm47\" (UniqueName: \"kubernetes.io/projected/79e8d634-a8e9-43d5-ac01-e640cc209af3-kube-api-access-dwm47\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:28 crc kubenswrapper[4636]: I1003 14:21:28.306759 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xd85\" (UniqueName: \"kubernetes.io/projected/f762c138-feda-4a6d-8d07-dfcbb5efaf4d-kube-api-access-8xd85\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:28 crc kubenswrapper[4636]: I1003 14:21:28.306770 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnrs4\" (UniqueName: \"kubernetes.io/projected/7f746edf-871e-487f-96f0-d640ee2e9266-kube-api-access-bnrs4\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:28 crc kubenswrapper[4636]: I1003 14:21:28.529359 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rbdk4" event={"ID":"f762c138-feda-4a6d-8d07-dfcbb5efaf4d","Type":"ContainerDied","Data":"fb99d7b99456baf4222c6d25af781559652ce9ac8c4d17a3458ba45b03eb4a07"} Oct 03 14:21:28 crc kubenswrapper[4636]: I1003 14:21:28.529510 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb99d7b99456baf4222c6d25af781559652ce9ac8c4d17a3458ba45b03eb4a07" Oct 03 14:21:28 crc kubenswrapper[4636]: I1003 14:21:28.529718 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rbdk4" Oct 03 14:21:28 crc kubenswrapper[4636]: I1003 14:21:28.531459 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-sfqrs" Oct 03 14:21:28 crc kubenswrapper[4636]: I1003 14:21:28.531455 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-sfqrs" event={"ID":"79e8d634-a8e9-43d5-ac01-e640cc209af3","Type":"ContainerDied","Data":"bdbef81c06207b04ea590384e9c9912dcb8789a2b58d93a41d103d0de711728b"} Oct 03 14:21:28 crc kubenswrapper[4636]: I1003 14:21:28.531645 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdbef81c06207b04ea590384e9c9912dcb8789a2b58d93a41d103d0de711728b" Oct 03 14:21:28 crc kubenswrapper[4636]: I1003 14:21:28.533230 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fj8lz" event={"ID":"7f746edf-871e-487f-96f0-d640ee2e9266","Type":"ContainerDied","Data":"0bc1a38073d1a37ed0e9e3745e50eb47b86241258cf4201c4ae538a322a596f0"} Oct 03 14:21:28 crc kubenswrapper[4636]: I1003 14:21:28.533276 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bc1a38073d1a37ed0e9e3745e50eb47b86241258cf4201c4ae538a322a596f0" Oct 03 14:21:28 crc kubenswrapper[4636]: I1003 14:21:28.533279 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-fj8lz" Oct 03 14:21:30 crc kubenswrapper[4636]: I1003 14:21:30.345358 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:21:30 crc kubenswrapper[4636]: I1003 14:21:30.448589 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:21:30 crc kubenswrapper[4636]: I1003 14:21:30.556593 4636 generic.go:334] "Generic (PLEG): container finished" podID="c6522439-da96-4e38-bf00-55a0fe9440e5" containerID="9227ef7e4c6e33e2458127568b9b034881cc76de197f93a7916ff8ecc5392bfb" exitCode=0 Oct 03 14:21:30 crc kubenswrapper[4636]: I1003 14:21:30.556659 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6522439-da96-4e38-bf00-55a0fe9440e5","Type":"ContainerDied","Data":"9227ef7e4c6e33e2458127568b9b034881cc76de197f93a7916ff8ecc5392bfb"} Oct 03 14:21:30 crc kubenswrapper[4636]: I1003 14:21:30.813598 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:21:30 crc kubenswrapper[4636]: I1003 14:21:30.957977 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6522439-da96-4e38-bf00-55a0fe9440e5-log-httpd\") pod \"c6522439-da96-4e38-bf00-55a0fe9440e5\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " Oct 03 14:21:30 crc kubenswrapper[4636]: I1003 14:21:30.958027 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7f7s\" (UniqueName: \"kubernetes.io/projected/c6522439-da96-4e38-bf00-55a0fe9440e5-kube-api-access-r7f7s\") pod \"c6522439-da96-4e38-bf00-55a0fe9440e5\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " Oct 03 14:21:30 crc kubenswrapper[4636]: I1003 14:21:30.958173 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-scripts\") pod \"c6522439-da96-4e38-bf00-55a0fe9440e5\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " Oct 03 14:21:30 crc kubenswrapper[4636]: I1003 14:21:30.958249 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-ceilometer-tls-certs\") pod \"c6522439-da96-4e38-bf00-55a0fe9440e5\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " Oct 03 14:21:30 crc kubenswrapper[4636]: I1003 14:21:30.958313 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-sg-core-conf-yaml\") pod \"c6522439-da96-4e38-bf00-55a0fe9440e5\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " Oct 03 14:21:30 crc kubenswrapper[4636]: I1003 14:21:30.958339 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-combined-ca-bundle\") pod \"c6522439-da96-4e38-bf00-55a0fe9440e5\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " Oct 03 14:21:30 crc kubenswrapper[4636]: I1003 14:21:30.958361 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-config-data\") pod 
\"c6522439-da96-4e38-bf00-55a0fe9440e5\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " Oct 03 14:21:30 crc kubenswrapper[4636]: I1003 14:21:30.959078 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6522439-da96-4e38-bf00-55a0fe9440e5-run-httpd\") pod \"c6522439-da96-4e38-bf00-55a0fe9440e5\" (UID: \"c6522439-da96-4e38-bf00-55a0fe9440e5\") " Oct 03 14:21:30 crc kubenswrapper[4636]: I1003 14:21:30.958763 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6522439-da96-4e38-bf00-55a0fe9440e5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c6522439-da96-4e38-bf00-55a0fe9440e5" (UID: "c6522439-da96-4e38-bf00-55a0fe9440e5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:21:30 crc kubenswrapper[4636]: I1003 14:21:30.959670 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6522439-da96-4e38-bf00-55a0fe9440e5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c6522439-da96-4e38-bf00-55a0fe9440e5" (UID: "c6522439-da96-4e38-bf00-55a0fe9440e5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:21:30 crc kubenswrapper[4636]: I1003 14:21:30.963563 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6522439-da96-4e38-bf00-55a0fe9440e5-kube-api-access-r7f7s" (OuterVolumeSpecName: "kube-api-access-r7f7s") pod "c6522439-da96-4e38-bf00-55a0fe9440e5" (UID: "c6522439-da96-4e38-bf00-55a0fe9440e5"). InnerVolumeSpecName "kube-api-access-r7f7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:21:30 crc kubenswrapper[4636]: I1003 14:21:30.963880 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-scripts" (OuterVolumeSpecName: "scripts") pod "c6522439-da96-4e38-bf00-55a0fe9440e5" (UID: "c6522439-da96-4e38-bf00-55a0fe9440e5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.026023 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c6522439-da96-4e38-bf00-55a0fe9440e5" (UID: "c6522439-da96-4e38-bf00-55a0fe9440e5"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.031889 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c6522439-da96-4e38-bf00-55a0fe9440e5" (UID: "c6522439-da96-4e38-bf00-55a0fe9440e5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.052878 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6522439-da96-4e38-bf00-55a0fe9440e5" (UID: "c6522439-da96-4e38-bf00-55a0fe9440e5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.062230 4636 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.062432 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.062517 4636 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6522439-da96-4e38-bf00-55a0fe9440e5-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.062612 4636 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6522439-da96-4e38-bf00-55a0fe9440e5-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.062670 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7f7s\" (UniqueName: \"kubernetes.io/projected/c6522439-da96-4e38-bf00-55a0fe9440e5-kube-api-access-r7f7s\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.062742 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.062804 4636 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.110386 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-config-data" (OuterVolumeSpecName: "config-data") pod "c6522439-da96-4e38-bf00-55a0fe9440e5" (UID: "c6522439-da96-4e38-bf00-55a0fe9440e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.164084 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6522439-da96-4e38-bf00-55a0fe9440e5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.568041 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6522439-da96-4e38-bf00-55a0fe9440e5","Type":"ContainerDied","Data":"e02e679efcf1e0214b2e5ecf31aed1de7e5d50cb2f22e5a3dfc0a6632984388d"} Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.568153 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.568414 4636 scope.go:117] "RemoveContainer" containerID="c294a4bd6fdd37fa9663cd4e4f535379cc154a64b4cc9956b1e2e0a63223ed39" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.589106 4636 scope.go:117] "RemoveContainer" containerID="6443c490dbc0d6baf480e964ad3e49492055811128ff8927a107f50db112f742" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.604150 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.614473 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.632981 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:31 crc kubenswrapper[4636]: E1003 14:21:31.633327 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6522439-da96-4e38-bf00-55a0fe9440e5" containerName="sg-core" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.633344 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6522439-da96-4e38-bf00-55a0fe9440e5" containerName="sg-core" Oct 03 14:21:31 crc kubenswrapper[4636]: E1003 14:21:31.633357 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6522439-da96-4e38-bf00-55a0fe9440e5" containerName="ceilometer-central-agent" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.633366 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6522439-da96-4e38-bf00-55a0fe9440e5" containerName="ceilometer-central-agent" Oct 03 14:21:31 crc kubenswrapper[4636]: E1003 14:21:31.633377 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e8d634-a8e9-43d5-ac01-e640cc209af3" containerName="mariadb-database-create" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.633383 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e8d634-a8e9-43d5-ac01-e640cc209af3" containerName="mariadb-database-create" Oct 03 14:21:31 crc kubenswrapper[4636]: E1003 14:21:31.633396 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6522439-da96-4e38-bf00-55a0fe9440e5" containerName="ceilometer-notification-agent" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.633401 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6522439-da96-4e38-bf00-55a0fe9440e5" containerName="ceilometer-notification-agent" Oct 03 14:21:31 crc kubenswrapper[4636]: E1003 14:21:31.633416 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f746edf-871e-487f-96f0-d640ee2e9266" containerName="mariadb-database-create" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.633423 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f746edf-871e-487f-96f0-d640ee2e9266" containerName="mariadb-database-create" Oct 03 14:21:31 crc kubenswrapper[4636]: E1003 14:21:31.633447 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6522439-da96-4e38-bf00-55a0fe9440e5" containerName="proxy-httpd" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.633452 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6522439-da96-4e38-bf00-55a0fe9440e5" containerName="proxy-httpd" Oct 03 14:21:31 crc kubenswrapper[4636]: E1003 14:21:31.633464 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f762c138-feda-4a6d-8d07-dfcbb5efaf4d" containerName="mariadb-database-create" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.633470 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="f762c138-feda-4a6d-8d07-dfcbb5efaf4d" containerName="mariadb-database-create"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.633630 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="f762c138-feda-4a6d-8d07-dfcbb5efaf4d" containerName="mariadb-database-create"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.633643 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6522439-da96-4e38-bf00-55a0fe9440e5" containerName="ceilometer-notification-agent"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.633652 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="79e8d634-a8e9-43d5-ac01-e640cc209af3" containerName="mariadb-database-create"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.633665 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f746edf-871e-487f-96f0-d640ee2e9266" containerName="mariadb-database-create"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.633676 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6522439-da96-4e38-bf00-55a0fe9440e5" containerName="sg-core"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.633684 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6522439-da96-4e38-bf00-55a0fe9440e5" containerName="ceilometer-central-agent"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.633695 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6522439-da96-4e38-bf00-55a0fe9440e5" containerName="proxy-httpd"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.635294 4636 scope.go:117] "RemoveContainer" containerID="34e39474bac0de2fc8fe1458878ee903fc23478edd3422d7d7891660333908dd"
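The cpu_manager / state_mem / memory_manager burst above is housekeeping performed while admitting the replacement ceilometer-0 pod: RemoveStaleState drops per-container CPU-set and memory-manager assignments belonging to containers that no longer exist (the old ceilometer-0 containers and the completed mariadb-database-create jobs). Although cpu_manager.go emits the E-prefixed lines at error severity, each one is immediately paired with a successful "Deleted CPUSet assignment", so this appears to be routine cleanup rather than a failure.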
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.642014 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.642633 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.642847 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.665057 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.683312 4636 scope.go:117] "RemoveContainer" containerID="9227ef7e4c6e33e2458127568b9b034881cc76de197f93a7916ff8ecc5392bfb" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.775836 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-scripts\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.775929 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd34364d-ad77-448b-958b-95ba901dd4e4-log-httpd\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.776206 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd34364d-ad77-448b-958b-95ba901dd4e4-run-httpd\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.776239 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-config-data\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.776276 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7f8s\" (UniqueName: \"kubernetes.io/projected/fd34364d-ad77-448b-958b-95ba901dd4e4-kube-api-access-r7f8s\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.776328 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.776415 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0" Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.776552 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.877666 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-scripts\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.877756 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd34364d-ad77-448b-958b-95ba901dd4e4-log-httpd\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.877807 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd34364d-ad77-448b-958b-95ba901dd4e4-run-httpd\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.877827 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-config-data\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.877854 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7f8s\" (UniqueName: \"kubernetes.io/projected/fd34364d-ad77-448b-958b-95ba901dd4e4-kube-api-access-r7f8s\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.877901 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.877932 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.877980 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.878862 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd34364d-ad77-448b-958b-95ba901dd4e4-log-httpd\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.878914 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd34364d-ad77-448b-958b-95ba901dd4e4-run-httpd\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.883451 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-config-data\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.884335 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-scripts\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.886679 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.887202 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.893731 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0"
Oct 03 14:21:31 crc kubenswrapper[4636]: I1003 14:21:31.896136 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7f8s\" (UniqueName: \"kubernetes.io/projected/fd34364d-ad77-448b-958b-95ba901dd4e4-kube-api-access-r7f8s\") pod \"ceilometer-0\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " pod="openstack/ceilometer-0"
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:21:32 crc kubenswrapper[4636]: I1003 14:21:32.416439 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:21:32 crc kubenswrapper[4636]: I1003 14:21:32.513718 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:32 crc kubenswrapper[4636]: W1003 14:21:32.524337 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd34364d_ad77_448b_958b_95ba901dd4e4.slice/crio-fc6f0112147a301d12a2447846b354ecb8fa8843e8e20e60ddfc9ac3d410085f WatchSource:0}: Error finding container fc6f0112147a301d12a2447846b354ecb8fa8843e8e20e60ddfc9ac3d410085f: Status 404 returned error can't find the container with id fc6f0112147a301d12a2447846b354ecb8fa8843e8e20e60ddfc9ac3d410085f Oct 03 14:21:32 crc kubenswrapper[4636]: I1003 14:21:32.599312 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd34364d-ad77-448b-958b-95ba901dd4e4","Type":"ContainerStarted","Data":"fc6f0112147a301d12a2447846b354ecb8fa8843e8e20e60ddfc9ac3d410085f"} Oct 03 14:21:32 crc kubenswrapper[4636]: I1003 14:21:32.611893 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-8c5bc9456-rfvns" Oct 03 14:21:32 crc kubenswrapper[4636]: I1003 14:21:32.690694 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7976d47688-kx5v5"] Oct 03 14:21:32 crc kubenswrapper[4636]: I1003 14:21:32.690901 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7976d47688-kx5v5" podUID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerName="horizon-log" containerID="cri-o://f07e34e6b6d7315b30759822fbb4041e8a844861251cba7541a6632092e00e7f" gracePeriod=30 Oct 03 14:21:32 crc kubenswrapper[4636]: I1003 14:21:32.691421 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7976d47688-kx5v5" podUID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerName="horizon" containerID="cri-o://04ea3c8403ac5e4db0992d3e740398054c3ed5e4c5fb3d90d47fa1626284e1c7" gracePeriod=30 Oct 03 14:21:32 crc kubenswrapper[4636]: I1003 14:21:32.805757 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6522439-da96-4e38-bf00-55a0fe9440e5" path="/var/lib/kubelet/pods/c6522439-da96-4e38-bf00-55a0fe9440e5/volumes" Oct 03 14:21:33 crc kubenswrapper[4636]: I1003 14:21:33.610611 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd34364d-ad77-448b-958b-95ba901dd4e4","Type":"ContainerStarted","Data":"d2d5aaa2320034de18acbb11c3dd24e37ee46daea1180ce481e452f800a78c3a"} Oct 03 14:21:34 crc kubenswrapper[4636]: I1003 14:21:34.621985 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd34364d-ad77-448b-958b-95ba901dd4e4","Type":"ContainerStarted","Data":"2622436e2328447c8b9197238d72a3bb081063b6f798fd451fa1bbadcbb52b8f"} Oct 03 14:21:34 crc kubenswrapper[4636]: I1003 14:21:34.622383 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd34364d-ad77-448b-958b-95ba901dd4e4","Type":"ContainerStarted","Data":"7574bf11dca3cc98902d1c641bba262051ce8b6c6327da1929b4bab419164808"} Oct 03 14:21:34 crc kubenswrapper[4636]: I1003 14:21:34.696246 4636 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-api-3116-account-create-hc7k2"] Oct 03 14:21:34 crc kubenswrapper[4636]: I1003 14:21:34.697539 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3116-account-create-hc7k2" Oct 03 14:21:34 crc kubenswrapper[4636]: I1003 14:21:34.699789 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 03 14:21:34 crc kubenswrapper[4636]: I1003 14:21:34.718422 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3116-account-create-hc7k2"] Oct 03 14:21:34 crc kubenswrapper[4636]: I1003 14:21:34.835710 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfhqm\" (UniqueName: \"kubernetes.io/projected/1ddc23ff-72a1-4527-864e-0cd4fd4b3cb4-kube-api-access-pfhqm\") pod \"nova-api-3116-account-create-hc7k2\" (UID: \"1ddc23ff-72a1-4527-864e-0cd4fd4b3cb4\") " pod="openstack/nova-api-3116-account-create-hc7k2" Oct 03 14:21:34 crc kubenswrapper[4636]: I1003 14:21:34.898371 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cc4c-account-create-2g6t4"] Oct 03 14:21:34 crc kubenswrapper[4636]: I1003 14:21:34.899778 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cc4c-account-create-2g6t4" Oct 03 14:21:34 crc kubenswrapper[4636]: I1003 14:21:34.902744 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 03 14:21:34 crc kubenswrapper[4636]: I1003 14:21:34.913320 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cc4c-account-create-2g6t4"] Oct 03 14:21:34 crc kubenswrapper[4636]: I1003 14:21:34.937303 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfhqm\" (UniqueName: \"kubernetes.io/projected/1ddc23ff-72a1-4527-864e-0cd4fd4b3cb4-kube-api-access-pfhqm\") pod \"nova-api-3116-account-create-hc7k2\" (UID: \"1ddc23ff-72a1-4527-864e-0cd4fd4b3cb4\") " pod="openstack/nova-api-3116-account-create-hc7k2" Oct 03 14:21:34 crc kubenswrapper[4636]: I1003 14:21:34.937360 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc94n\" (UniqueName: \"kubernetes.io/projected/549b8746-9d8a-4f99-82b7-e03650acb897-kube-api-access-hc94n\") pod \"nova-cell0-cc4c-account-create-2g6t4\" (UID: \"549b8746-9d8a-4f99-82b7-e03650acb897\") " pod="openstack/nova-cell0-cc4c-account-create-2g6t4" Oct 03 14:21:34 crc kubenswrapper[4636]: I1003 14:21:34.959073 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfhqm\" (UniqueName: \"kubernetes.io/projected/1ddc23ff-72a1-4527-864e-0cd4fd4b3cb4-kube-api-access-pfhqm\") pod \"nova-api-3116-account-create-hc7k2\" (UID: \"1ddc23ff-72a1-4527-864e-0cd4fd4b3cb4\") " pod="openstack/nova-api-3116-account-create-hc7k2" Oct 03 14:21:35 crc kubenswrapper[4636]: I1003 14:21:35.017396 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-3116-account-create-hc7k2" Oct 03 14:21:35 crc kubenswrapper[4636]: I1003 14:21:35.039416 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc94n\" (UniqueName: \"kubernetes.io/projected/549b8746-9d8a-4f99-82b7-e03650acb897-kube-api-access-hc94n\") pod \"nova-cell0-cc4c-account-create-2g6t4\" (UID: \"549b8746-9d8a-4f99-82b7-e03650acb897\") " pod="openstack/nova-cell0-cc4c-account-create-2g6t4" Oct 03 14:21:35 crc kubenswrapper[4636]: I1003 14:21:35.056561 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc94n\" (UniqueName: \"kubernetes.io/projected/549b8746-9d8a-4f99-82b7-e03650acb897-kube-api-access-hc94n\") pod \"nova-cell0-cc4c-account-create-2g6t4\" (UID: \"549b8746-9d8a-4f99-82b7-e03650acb897\") " pod="openstack/nova-cell0-cc4c-account-create-2g6t4" Oct 03 14:21:35 crc kubenswrapper[4636]: I1003 14:21:35.103398 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-c311-account-create-s5q7f"] Oct 03 14:21:35 crc kubenswrapper[4636]: I1003 14:21:35.104806 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c311-account-create-s5q7f" Oct 03 14:21:35 crc kubenswrapper[4636]: I1003 14:21:35.107199 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 03 14:21:35 crc kubenswrapper[4636]: I1003 14:21:35.110401 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c311-account-create-s5q7f"] Oct 03 14:21:35 crc kubenswrapper[4636]: I1003 14:21:35.143716 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdfgs\" (UniqueName: \"kubernetes.io/projected/1dcc76fa-3c4f-4196-b6b8-3add9559c134-kube-api-access-pdfgs\") pod \"nova-cell1-c311-account-create-s5q7f\" (UID: \"1dcc76fa-3c4f-4196-b6b8-3add9559c134\") " pod="openstack/nova-cell1-c311-account-create-s5q7f" Oct 03 14:21:35 crc kubenswrapper[4636]: I1003 14:21:35.217556 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cc4c-account-create-2g6t4" Oct 03 14:21:35 crc kubenswrapper[4636]: I1003 14:21:35.248504 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdfgs\" (UniqueName: \"kubernetes.io/projected/1dcc76fa-3c4f-4196-b6b8-3add9559c134-kube-api-access-pdfgs\") pod \"nova-cell1-c311-account-create-s5q7f\" (UID: \"1dcc76fa-3c4f-4196-b6b8-3add9559c134\") " pod="openstack/nova-cell1-c311-account-create-s5q7f" Oct 03 14:21:35 crc kubenswrapper[4636]: I1003 14:21:35.266946 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdfgs\" (UniqueName: \"kubernetes.io/projected/1dcc76fa-3c4f-4196-b6b8-3add9559c134-kube-api-access-pdfgs\") pod \"nova-cell1-c311-account-create-s5q7f\" (UID: \"1dcc76fa-3c4f-4196-b6b8-3add9559c134\") " pod="openstack/nova-cell1-c311-account-create-s5q7f" Oct 03 14:21:35 crc kubenswrapper[4636]: I1003 14:21:35.462163 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c311-account-create-s5q7f" Oct 03 14:21:35 crc kubenswrapper[4636]: I1003 14:21:35.508969 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3116-account-create-hc7k2"] Oct 03 14:21:35 crc kubenswrapper[4636]: W1003 14:21:35.514669 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ddc23ff_72a1_4527_864e_0cd4fd4b3cb4.slice/crio-6909cca70298adb9cab1add7cd9fffa4171bc08ac2729066320defa2c71f1a3f WatchSource:0}: Error finding container 6909cca70298adb9cab1add7cd9fffa4171bc08ac2729066320defa2c71f1a3f: Status 404 returned error can't find the container with id 6909cca70298adb9cab1add7cd9fffa4171bc08ac2729066320defa2c71f1a3f Oct 03 14:21:35 crc kubenswrapper[4636]: I1003 14:21:35.668486 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3116-account-create-hc7k2" event={"ID":"1ddc23ff-72a1-4527-864e-0cd4fd4b3cb4","Type":"ContainerStarted","Data":"6909cca70298adb9cab1add7cd9fffa4171bc08ac2729066320defa2c71f1a3f"} Oct 03 14:21:35 crc kubenswrapper[4636]: I1003 14:21:35.675122 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cc4c-account-create-2g6t4"] Oct 03 14:21:35 crc kubenswrapper[4636]: I1003 14:21:35.958378 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c311-account-create-s5q7f"] Oct 03 14:21:35 crc kubenswrapper[4636]: W1003 14:21:35.984449 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dcc76fa_3c4f_4196_b6b8_3add9559c134.slice/crio-03867f89fea5e5a254064a45369332230d9f33ce2f01c61881436c5401368156 WatchSource:0}: Error finding container 03867f89fea5e5a254064a45369332230d9f33ce2f01c61881436c5401368156: Status 404 returned error can't find the container with id 03867f89fea5e5a254064a45369332230d9f33ce2f01c61881436c5401368156 Oct 03 14:21:36 crc kubenswrapper[4636]: I1003 14:21:36.689300 4636 generic.go:334] "Generic (PLEG): container finished" podID="1ddc23ff-72a1-4527-864e-0cd4fd4b3cb4" containerID="62e23a8427ddb727a81b774a9ad6a01057905ac1eda53c868bbf61793a4ec9b2" exitCode=0 Oct 03 14:21:36 crc kubenswrapper[4636]: I1003 14:21:36.689387 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3116-account-create-hc7k2" event={"ID":"1ddc23ff-72a1-4527-864e-0cd4fd4b3cb4","Type":"ContainerDied","Data":"62e23a8427ddb727a81b774a9ad6a01057905ac1eda53c868bbf61793a4ec9b2"} Oct 03 14:21:36 crc kubenswrapper[4636]: I1003 14:21:36.692411 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd34364d-ad77-448b-958b-95ba901dd4e4","Type":"ContainerStarted","Data":"a8063e6656059c999dadc0c00c5a2215b09a6ad4b7af42b220ff8a6917069b53"} Oct 03 14:21:36 crc kubenswrapper[4636]: I1003 14:21:36.692555 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 14:21:36 crc kubenswrapper[4636]: I1003 14:21:36.695662 4636 generic.go:334] "Generic (PLEG): container finished" podID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerID="04ea3c8403ac5e4db0992d3e740398054c3ed5e4c5fb3d90d47fa1626284e1c7" exitCode=0 Oct 03 14:21:36 crc kubenswrapper[4636]: I1003 14:21:36.695725 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7976d47688-kx5v5" 
event={"ID":"92ef2fa8-5e4e-49f1-8840-01b5be29d036","Type":"ContainerDied","Data":"04ea3c8403ac5e4db0992d3e740398054c3ed5e4c5fb3d90d47fa1626284e1c7"} Oct 03 14:21:36 crc kubenswrapper[4636]: I1003 14:21:36.695785 4636 scope.go:117] "RemoveContainer" containerID="6f203755d3b7d2412b112d16f7778187f0cdb274206e2b3e4aaeccb274cae768" Oct 03 14:21:36 crc kubenswrapper[4636]: I1003 14:21:36.697059 4636 generic.go:334] "Generic (PLEG): container finished" podID="1dcc76fa-3c4f-4196-b6b8-3add9559c134" containerID="06420344c54af020ec13a167ce7aa2de36ddaaf0ab59509bc6346fa8e6105a3b" exitCode=0 Oct 03 14:21:36 crc kubenswrapper[4636]: I1003 14:21:36.697135 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c311-account-create-s5q7f" event={"ID":"1dcc76fa-3c4f-4196-b6b8-3add9559c134","Type":"ContainerDied","Data":"06420344c54af020ec13a167ce7aa2de36ddaaf0ab59509bc6346fa8e6105a3b"} Oct 03 14:21:36 crc kubenswrapper[4636]: I1003 14:21:36.697161 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c311-account-create-s5q7f" event={"ID":"1dcc76fa-3c4f-4196-b6b8-3add9559c134","Type":"ContainerStarted","Data":"03867f89fea5e5a254064a45369332230d9f33ce2f01c61881436c5401368156"} Oct 03 14:21:36 crc kubenswrapper[4636]: I1003 14:21:36.700566 4636 generic.go:334] "Generic (PLEG): container finished" podID="549b8746-9d8a-4f99-82b7-e03650acb897" containerID="a7bfdb1098cf857476b93f5a191c1c86c969d42dedaa18fe09b18e1ed16a0035" exitCode=0 Oct 03 14:21:36 crc kubenswrapper[4636]: I1003 14:21:36.700605 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cc4c-account-create-2g6t4" event={"ID":"549b8746-9d8a-4f99-82b7-e03650acb897","Type":"ContainerDied","Data":"a7bfdb1098cf857476b93f5a191c1c86c969d42dedaa18fe09b18e1ed16a0035"} Oct 03 14:21:36 crc kubenswrapper[4636]: I1003 14:21:36.700626 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cc4c-account-create-2g6t4" event={"ID":"549b8746-9d8a-4f99-82b7-e03650acb897","Type":"ContainerStarted","Data":"91512b022f52828efb35f44d7f97f2ef1959d681cb50604bde80453c3883025f"} Oct 03 14:21:36 crc kubenswrapper[4636]: I1003 14:21:36.724710 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.598646574 podStartE2EDuration="5.724687396s" podCreationTimestamp="2025-10-03 14:21:31 +0000 UTC" firstStartedPulling="2025-10-03 14:21:32.53879004 +0000 UTC m=+1242.397516287" lastFinishedPulling="2025-10-03 14:21:35.664830862 +0000 UTC m=+1245.523557109" observedRunningTime="2025-10-03 14:21:36.72163381 +0000 UTC m=+1246.580360067" watchObservedRunningTime="2025-10-03 14:21:36.724687396 +0000 UTC m=+1246.583413633" Oct 03 14:21:37 crc kubenswrapper[4636]: I1003 14:21:37.692855 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7976d47688-kx5v5" podUID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Oct 03 14:21:38 crc kubenswrapper[4636]: I1003 14:21:38.064755 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c311-account-create-s5q7f" Oct 03 14:21:38 crc kubenswrapper[4636]: I1003 14:21:38.210858 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdfgs\" (UniqueName: \"kubernetes.io/projected/1dcc76fa-3c4f-4196-b6b8-3add9559c134-kube-api-access-pdfgs\") pod \"1dcc76fa-3c4f-4196-b6b8-3add9559c134\" (UID: \"1dcc76fa-3c4f-4196-b6b8-3add9559c134\") " Oct 03 14:21:38 crc kubenswrapper[4636]: I1003 14:21:38.220606 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cc4c-account-create-2g6t4" Oct 03 14:21:38 crc kubenswrapper[4636]: I1003 14:21:38.222958 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dcc76fa-3c4f-4196-b6b8-3add9559c134-kube-api-access-pdfgs" (OuterVolumeSpecName: "kube-api-access-pdfgs") pod "1dcc76fa-3c4f-4196-b6b8-3add9559c134" (UID: "1dcc76fa-3c4f-4196-b6b8-3add9559c134"). InnerVolumeSpecName "kube-api-access-pdfgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:21:38 crc kubenswrapper[4636]: I1003 14:21:38.260952 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3116-account-create-hc7k2" Oct 03 14:21:38 crc kubenswrapper[4636]: I1003 14:21:38.312707 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfhqm\" (UniqueName: \"kubernetes.io/projected/1ddc23ff-72a1-4527-864e-0cd4fd4b3cb4-kube-api-access-pfhqm\") pod \"1ddc23ff-72a1-4527-864e-0cd4fd4b3cb4\" (UID: \"1ddc23ff-72a1-4527-864e-0cd4fd4b3cb4\") " Oct 03 14:21:38 crc kubenswrapper[4636]: I1003 14:21:38.313005 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc94n\" (UniqueName: \"kubernetes.io/projected/549b8746-9d8a-4f99-82b7-e03650acb897-kube-api-access-hc94n\") pod \"549b8746-9d8a-4f99-82b7-e03650acb897\" (UID: \"549b8746-9d8a-4f99-82b7-e03650acb897\") " Oct 03 14:21:38 crc kubenswrapper[4636]: I1003 14:21:38.313482 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdfgs\" (UniqueName: \"kubernetes.io/projected/1dcc76fa-3c4f-4196-b6b8-3add9559c134-kube-api-access-pdfgs\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:38 crc kubenswrapper[4636]: I1003 14:21:38.315758 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ddc23ff-72a1-4527-864e-0cd4fd4b3cb4-kube-api-access-pfhqm" (OuterVolumeSpecName: "kube-api-access-pfhqm") pod "1ddc23ff-72a1-4527-864e-0cd4fd4b3cb4" (UID: "1ddc23ff-72a1-4527-864e-0cd4fd4b3cb4"). InnerVolumeSpecName "kube-api-access-pfhqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:21:38 crc kubenswrapper[4636]: I1003 14:21:38.316290 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/549b8746-9d8a-4f99-82b7-e03650acb897-kube-api-access-hc94n" (OuterVolumeSpecName: "kube-api-access-hc94n") pod "549b8746-9d8a-4f99-82b7-e03650acb897" (UID: "549b8746-9d8a-4f99-82b7-e03650acb897"). InnerVolumeSpecName "kube-api-access-hc94n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:21:38 crc kubenswrapper[4636]: I1003 14:21:38.415864 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc94n\" (UniqueName: \"kubernetes.io/projected/549b8746-9d8a-4f99-82b7-e03650acb897-kube-api-access-hc94n\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:38 crc kubenswrapper[4636]: I1003 14:21:38.415903 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfhqm\" (UniqueName: \"kubernetes.io/projected/1ddc23ff-72a1-4527-864e-0cd4fd4b3cb4-kube-api-access-pfhqm\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:38 crc kubenswrapper[4636]: I1003 14:21:38.724473 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3116-account-create-hc7k2" Oct 03 14:21:38 crc kubenswrapper[4636]: I1003 14:21:38.724431 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3116-account-create-hc7k2" event={"ID":"1ddc23ff-72a1-4527-864e-0cd4fd4b3cb4","Type":"ContainerDied","Data":"6909cca70298adb9cab1add7cd9fffa4171bc08ac2729066320defa2c71f1a3f"} Oct 03 14:21:38 crc kubenswrapper[4636]: I1003 14:21:38.724667 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6909cca70298adb9cab1add7cd9fffa4171bc08ac2729066320defa2c71f1a3f" Oct 03 14:21:38 crc kubenswrapper[4636]: I1003 14:21:38.726340 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c311-account-create-s5q7f" event={"ID":"1dcc76fa-3c4f-4196-b6b8-3add9559c134","Type":"ContainerDied","Data":"03867f89fea5e5a254064a45369332230d9f33ce2f01c61881436c5401368156"} Oct 03 14:21:38 crc kubenswrapper[4636]: I1003 14:21:38.726371 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03867f89fea5e5a254064a45369332230d9f33ce2f01c61881436c5401368156" Oct 03 14:21:38 crc kubenswrapper[4636]: I1003 14:21:38.726348 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c311-account-create-s5q7f" Oct 03 14:21:38 crc kubenswrapper[4636]: I1003 14:21:38.728490 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cc4c-account-create-2g6t4" event={"ID":"549b8746-9d8a-4f99-82b7-e03650acb897","Type":"ContainerDied","Data":"91512b022f52828efb35f44d7f97f2ef1959d681cb50604bde80453c3883025f"} Oct 03 14:21:38 crc kubenswrapper[4636]: I1003 14:21:38.728535 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91512b022f52828efb35f44d7f97f2ef1959d681cb50604bde80453c3883025f" Oct 03 14:21:38 crc kubenswrapper[4636]: I1003 14:21:38.728556 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cc4c-account-create-2g6t4" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.200778 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rw5t7"] Oct 03 14:21:40 crc kubenswrapper[4636]: E1003 14:21:40.209920 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ddc23ff-72a1-4527-864e-0cd4fd4b3cb4" containerName="mariadb-account-create" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.209943 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ddc23ff-72a1-4527-864e-0cd4fd4b3cb4" containerName="mariadb-account-create" Oct 03 14:21:40 crc kubenswrapper[4636]: E1003 14:21:40.209972 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dcc76fa-3c4f-4196-b6b8-3add9559c134" containerName="mariadb-account-create" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.209986 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dcc76fa-3c4f-4196-b6b8-3add9559c134" containerName="mariadb-account-create" Oct 03 14:21:40 crc kubenswrapper[4636]: E1003 14:21:40.210025 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="549b8746-9d8a-4f99-82b7-e03650acb897" containerName="mariadb-account-create" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.210032 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="549b8746-9d8a-4f99-82b7-e03650acb897" containerName="mariadb-account-create" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.210381 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ddc23ff-72a1-4527-864e-0cd4fd4b3cb4" containerName="mariadb-account-create" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.210422 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="549b8746-9d8a-4f99-82b7-e03650acb897" containerName="mariadb-account-create" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.210448 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dcc76fa-3c4f-4196-b6b8-3add9559c134" containerName="mariadb-account-create" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.211296 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rw5t7" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.220200 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.223987 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-9xcsh" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.224188 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.271468 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rw5t7"] Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.353052 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb2e76a1-5457-484c-b311-c46b1eecec12-scripts\") pod \"nova-cell0-conductor-db-sync-rw5t7\" (UID: \"fb2e76a1-5457-484c-b311-c46b1eecec12\") " pod="openstack/nova-cell0-conductor-db-sync-rw5t7" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.353191 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btxl2\" (UniqueName: \"kubernetes.io/projected/fb2e76a1-5457-484c-b311-c46b1eecec12-kube-api-access-btxl2\") pod \"nova-cell0-conductor-db-sync-rw5t7\" (UID: \"fb2e76a1-5457-484c-b311-c46b1eecec12\") " pod="openstack/nova-cell0-conductor-db-sync-rw5t7" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.353213 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb2e76a1-5457-484c-b311-c46b1eecec12-config-data\") pod \"nova-cell0-conductor-db-sync-rw5t7\" (UID: \"fb2e76a1-5457-484c-b311-c46b1eecec12\") " pod="openstack/nova-cell0-conductor-db-sync-rw5t7" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.353252 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb2e76a1-5457-484c-b311-c46b1eecec12-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rw5t7\" (UID: \"fb2e76a1-5457-484c-b311-c46b1eecec12\") " pod="openstack/nova-cell0-conductor-db-sync-rw5t7" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.455015 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb2e76a1-5457-484c-b311-c46b1eecec12-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rw5t7\" (UID: \"fb2e76a1-5457-484c-b311-c46b1eecec12\") " pod="openstack/nova-cell0-conductor-db-sync-rw5t7" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.455170 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb2e76a1-5457-484c-b311-c46b1eecec12-scripts\") pod \"nova-cell0-conductor-db-sync-rw5t7\" (UID: \"fb2e76a1-5457-484c-b311-c46b1eecec12\") " pod="openstack/nova-cell0-conductor-db-sync-rw5t7" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.455256 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btxl2\" (UniqueName: \"kubernetes.io/projected/fb2e76a1-5457-484c-b311-c46b1eecec12-kube-api-access-btxl2\") pod \"nova-cell0-conductor-db-sync-rw5t7\" (UID: 
\"fb2e76a1-5457-484c-b311-c46b1eecec12\") " pod="openstack/nova-cell0-conductor-db-sync-rw5t7" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.455277 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb2e76a1-5457-484c-b311-c46b1eecec12-config-data\") pod \"nova-cell0-conductor-db-sync-rw5t7\" (UID: \"fb2e76a1-5457-484c-b311-c46b1eecec12\") " pod="openstack/nova-cell0-conductor-db-sync-rw5t7" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.475718 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb2e76a1-5457-484c-b311-c46b1eecec12-config-data\") pod \"nova-cell0-conductor-db-sync-rw5t7\" (UID: \"fb2e76a1-5457-484c-b311-c46b1eecec12\") " pod="openstack/nova-cell0-conductor-db-sync-rw5t7" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.475900 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb2e76a1-5457-484c-b311-c46b1eecec12-scripts\") pod \"nova-cell0-conductor-db-sync-rw5t7\" (UID: \"fb2e76a1-5457-484c-b311-c46b1eecec12\") " pod="openstack/nova-cell0-conductor-db-sync-rw5t7" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.476336 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb2e76a1-5457-484c-b311-c46b1eecec12-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rw5t7\" (UID: \"fb2e76a1-5457-484c-b311-c46b1eecec12\") " pod="openstack/nova-cell0-conductor-db-sync-rw5t7" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.495829 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btxl2\" (UniqueName: \"kubernetes.io/projected/fb2e76a1-5457-484c-b311-c46b1eecec12-kube-api-access-btxl2\") pod \"nova-cell0-conductor-db-sync-rw5t7\" (UID: \"fb2e76a1-5457-484c-b311-c46b1eecec12\") " pod="openstack/nova-cell0-conductor-db-sync-rw5t7" Oct 03 14:21:40 crc kubenswrapper[4636]: I1003 14:21:40.572906 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rw5t7" Oct 03 14:21:41 crc kubenswrapper[4636]: I1003 14:21:41.106302 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rw5t7"] Oct 03 14:21:41 crc kubenswrapper[4636]: I1003 14:21:41.130290 4636 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 14:21:41 crc kubenswrapper[4636]: I1003 14:21:41.762050 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rw5t7" event={"ID":"fb2e76a1-5457-484c-b311-c46b1eecec12","Type":"ContainerStarted","Data":"3482dc5e368b5c92a867c048d17ff6d312c3cd1cde623f4eb43a8209b34f08a0"} Oct 03 14:21:43 crc kubenswrapper[4636]: I1003 14:21:43.602800 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:43 crc kubenswrapper[4636]: I1003 14:21:43.604995 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd34364d-ad77-448b-958b-95ba901dd4e4" containerName="ceilometer-central-agent" containerID="cri-o://d2d5aaa2320034de18acbb11c3dd24e37ee46daea1180ce481e452f800a78c3a" gracePeriod=30 Oct 03 14:21:43 crc kubenswrapper[4636]: I1003 14:21:43.605116 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd34364d-ad77-448b-958b-95ba901dd4e4" containerName="proxy-httpd" containerID="cri-o://a8063e6656059c999dadc0c00c5a2215b09a6ad4b7af42b220ff8a6917069b53" gracePeriod=30 Oct 03 14:21:43 crc kubenswrapper[4636]: I1003 14:21:43.605277 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd34364d-ad77-448b-958b-95ba901dd4e4" containerName="sg-core" containerID="cri-o://2622436e2328447c8b9197238d72a3bb081063b6f798fd451fa1bbadcbb52b8f" gracePeriod=30 Oct 03 14:21:43 crc kubenswrapper[4636]: I1003 14:21:43.605325 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd34364d-ad77-448b-958b-95ba901dd4e4" containerName="ceilometer-notification-agent" containerID="cri-o://7574bf11dca3cc98902d1c641bba262051ce8b6c6327da1929b4bab419164808" gracePeriod=30 Oct 03 14:21:44 crc kubenswrapper[4636]: I1003 14:21:44.798012 4636 generic.go:334] "Generic (PLEG): container finished" podID="fd34364d-ad77-448b-958b-95ba901dd4e4" containerID="a8063e6656059c999dadc0c00c5a2215b09a6ad4b7af42b220ff8a6917069b53" exitCode=0 Oct 03 14:21:44 crc kubenswrapper[4636]: I1003 14:21:44.798411 4636 generic.go:334] "Generic (PLEG): container finished" podID="fd34364d-ad77-448b-958b-95ba901dd4e4" containerID="2622436e2328447c8b9197238d72a3bb081063b6f798fd451fa1bbadcbb52b8f" exitCode=2 Oct 03 14:21:44 crc kubenswrapper[4636]: I1003 14:21:44.798424 4636 generic.go:334] "Generic (PLEG): container finished" podID="fd34364d-ad77-448b-958b-95ba901dd4e4" containerID="7574bf11dca3cc98902d1c641bba262051ce8b6c6327da1929b4bab419164808" exitCode=0 Oct 03 14:21:44 crc kubenswrapper[4636]: I1003 14:21:44.805389 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd34364d-ad77-448b-958b-95ba901dd4e4","Type":"ContainerDied","Data":"a8063e6656059c999dadc0c00c5a2215b09a6ad4b7af42b220ff8a6917069b53"} Oct 03 14:21:44 crc kubenswrapper[4636]: I1003 14:21:44.805429 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fd34364d-ad77-448b-958b-95ba901dd4e4","Type":"ContainerDied","Data":"2622436e2328447c8b9197238d72a3bb081063b6f798fd451fa1bbadcbb52b8f"} Oct 03 14:21:44 crc kubenswrapper[4636]: I1003 14:21:44.805441 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd34364d-ad77-448b-958b-95ba901dd4e4","Type":"ContainerDied","Data":"7574bf11dca3cc98902d1c641bba262051ce8b6c6327da1929b4bab419164808"} Oct 03 14:21:45 crc kubenswrapper[4636]: I1003 14:21:45.811452 4636 generic.go:334] "Generic (PLEG): container finished" podID="fd34364d-ad77-448b-958b-95ba901dd4e4" containerID="d2d5aaa2320034de18acbb11c3dd24e37ee46daea1180ce481e452f800a78c3a" exitCode=0 Oct 03 14:21:45 crc kubenswrapper[4636]: I1003 14:21:45.811532 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd34364d-ad77-448b-958b-95ba901dd4e4","Type":"ContainerDied","Data":"d2d5aaa2320034de18acbb11c3dd24e37ee46daea1180ce481e452f800a78c3a"} Oct 03 14:21:47 crc kubenswrapper[4636]: I1003 14:21:47.692610 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7976d47688-kx5v5" podUID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.509628 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.651260 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd34364d-ad77-448b-958b-95ba901dd4e4-log-httpd\") pod \"fd34364d-ad77-448b-958b-95ba901dd4e4\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.651349 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-combined-ca-bundle\") pod \"fd34364d-ad77-448b-958b-95ba901dd4e4\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.651380 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7f8s\" (UniqueName: \"kubernetes.io/projected/fd34364d-ad77-448b-958b-95ba901dd4e4-kube-api-access-r7f8s\") pod \"fd34364d-ad77-448b-958b-95ba901dd4e4\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.651405 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-ceilometer-tls-certs\") pod \"fd34364d-ad77-448b-958b-95ba901dd4e4\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.651455 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd34364d-ad77-448b-958b-95ba901dd4e4-run-httpd\") pod \"fd34364d-ad77-448b-958b-95ba901dd4e4\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.651610 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-scripts\") pod \"fd34364d-ad77-448b-958b-95ba901dd4e4\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.651675 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-sg-core-conf-yaml\") pod \"fd34364d-ad77-448b-958b-95ba901dd4e4\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.652412 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-config-data\") pod \"fd34364d-ad77-448b-958b-95ba901dd4e4\" (UID: \"fd34364d-ad77-448b-958b-95ba901dd4e4\") " Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.674423 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd34364d-ad77-448b-958b-95ba901dd4e4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fd34364d-ad77-448b-958b-95ba901dd4e4" (UID: "fd34364d-ad77-448b-958b-95ba901dd4e4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.676292 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd34364d-ad77-448b-958b-95ba901dd4e4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fd34364d-ad77-448b-958b-95ba901dd4e4" (UID: "fd34364d-ad77-448b-958b-95ba901dd4e4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.676693 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd34364d-ad77-448b-958b-95ba901dd4e4-kube-api-access-r7f8s" (OuterVolumeSpecName: "kube-api-access-r7f8s") pod "fd34364d-ad77-448b-958b-95ba901dd4e4" (UID: "fd34364d-ad77-448b-958b-95ba901dd4e4"). InnerVolumeSpecName "kube-api-access-r7f8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.680308 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-scripts" (OuterVolumeSpecName: "scripts") pod "fd34364d-ad77-448b-958b-95ba901dd4e4" (UID: "fd34364d-ad77-448b-958b-95ba901dd4e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.706399 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fd34364d-ad77-448b-958b-95ba901dd4e4" (UID: "fd34364d-ad77-448b-958b-95ba901dd4e4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.708643 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "fd34364d-ad77-448b-958b-95ba901dd4e4" (UID: "fd34364d-ad77-448b-958b-95ba901dd4e4"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.731088 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd34364d-ad77-448b-958b-95ba901dd4e4" (UID: "fd34364d-ad77-448b-958b-95ba901dd4e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.752080 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-config-data" (OuterVolumeSpecName: "config-data") pod "fd34364d-ad77-448b-958b-95ba901dd4e4" (UID: "fd34364d-ad77-448b-958b-95ba901dd4e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.755035 4636 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd34364d-ad77-448b-958b-95ba901dd4e4-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.755088 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.755150 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7f8s\" (UniqueName: \"kubernetes.io/projected/fd34364d-ad77-448b-958b-95ba901dd4e4-kube-api-access-r7f8s\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.755167 4636 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.755190 4636 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd34364d-ad77-448b-958b-95ba901dd4e4-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.755201 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.755212 4636 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.755223 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd34364d-ad77-448b-958b-95ba901dd4e4-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.851008 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd34364d-ad77-448b-958b-95ba901dd4e4","Type":"ContainerDied","Data":"fc6f0112147a301d12a2447846b354ecb8fa8843e8e20e60ddfc9ac3d410085f"} Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.851055 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.851071 4636 scope.go:117] "RemoveContainer" containerID="a8063e6656059c999dadc0c00c5a2215b09a6ad4b7af42b220ff8a6917069b53" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.899172 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.918326 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.929719 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:49 crc kubenswrapper[4636]: E1003 14:21:49.930309 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd34364d-ad77-448b-958b-95ba901dd4e4" containerName="sg-core" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.930323 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd34364d-ad77-448b-958b-95ba901dd4e4" containerName="sg-core" Oct 03 14:21:49 crc kubenswrapper[4636]: E1003 14:21:49.930347 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd34364d-ad77-448b-958b-95ba901dd4e4" containerName="ceilometer-notification-agent" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.930355 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd34364d-ad77-448b-958b-95ba901dd4e4" containerName="ceilometer-notification-agent" Oct 03 14:21:49 crc kubenswrapper[4636]: E1003 14:21:49.930376 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd34364d-ad77-448b-958b-95ba901dd4e4" containerName="proxy-httpd" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.930383 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd34364d-ad77-448b-958b-95ba901dd4e4" containerName="proxy-httpd" Oct 03 14:21:49 crc kubenswrapper[4636]: E1003 14:21:49.930394 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd34364d-ad77-448b-958b-95ba901dd4e4" containerName="ceilometer-central-agent" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.930401 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd34364d-ad77-448b-958b-95ba901dd4e4" containerName="ceilometer-central-agent" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.930584 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd34364d-ad77-448b-958b-95ba901dd4e4" containerName="ceilometer-notification-agent" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.930604 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd34364d-ad77-448b-958b-95ba901dd4e4" containerName="proxy-httpd" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.930628 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd34364d-ad77-448b-958b-95ba901dd4e4" containerName="ceilometer-central-agent" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.930641 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd34364d-ad77-448b-958b-95ba901dd4e4" containerName="sg-core" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.932438 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.932530 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.960458 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.960736 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 14:21:49 crc kubenswrapper[4636]: I1003 14:21:49.960856 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.060314 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.060613 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.060727 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03b9cd0b-5397-46c3-af28-f6e766fc596b-log-httpd\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.060838 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-scripts\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.060933 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wncbf\" (UniqueName: \"kubernetes.io/projected/03b9cd0b-5397-46c3-af28-f6e766fc596b-kube-api-access-wncbf\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.061067 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03b9cd0b-5397-46c3-af28-f6e766fc596b-run-httpd\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.061191 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-config-data\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.061301 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 
03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.162951 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-scripts\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.163006 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wncbf\" (UniqueName: \"kubernetes.io/projected/03b9cd0b-5397-46c3-af28-f6e766fc596b-kube-api-access-wncbf\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.163125 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03b9cd0b-5397-46c3-af28-f6e766fc596b-run-httpd\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.163178 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-config-data\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.163214 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.163241 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.163272 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.163328 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03b9cd0b-5397-46c3-af28-f6e766fc596b-log-httpd\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.164034 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03b9cd0b-5397-46c3-af28-f6e766fc596b-run-httpd\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.165043 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03b9cd0b-5397-46c3-af28-f6e766fc596b-log-httpd\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc 
kubenswrapper[4636]: I1003 14:21:50.167528 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-config-data\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.168963 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-scripts\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.169245 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.176514 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.177824 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.179937 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wncbf\" (UniqueName: \"kubernetes.io/projected/03b9cd0b-5397-46c3-af28-f6e766fc596b-kube-api-access-wncbf\") pod \"ceilometer-0\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.272523 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:21:50 crc kubenswrapper[4636]: I1003 14:21:50.808892 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd34364d-ad77-448b-958b-95ba901dd4e4" path="/var/lib/kubelet/pods/fd34364d-ad77-448b-958b-95ba901dd4e4/volumes" Oct 03 14:21:51 crc kubenswrapper[4636]: I1003 14:21:51.064016 4636 scope.go:117] "RemoveContainer" containerID="2622436e2328447c8b9197238d72a3bb081063b6f798fd451fa1bbadcbb52b8f" Oct 03 14:21:51 crc kubenswrapper[4636]: I1003 14:21:51.130115 4636 scope.go:117] "RemoveContainer" containerID="7574bf11dca3cc98902d1c641bba262051ce8b6c6327da1929b4bab419164808" Oct 03 14:21:51 crc kubenswrapper[4636]: I1003 14:21:51.291581 4636 scope.go:117] "RemoveContainer" containerID="d2d5aaa2320034de18acbb11c3dd24e37ee46daea1180ce481e452f800a78c3a" Oct 03 14:21:51 crc kubenswrapper[4636]: I1003 14:21:51.627316 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:21:51 crc kubenswrapper[4636]: W1003 14:21:51.639371 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03b9cd0b_5397_46c3_af28_f6e766fc596b.slice/crio-7b6af38c7b40ebd6ba67b770af8702bb27fd394bd7ea17c4c363b599efbbd055 WatchSource:0}: Error finding container 7b6af38c7b40ebd6ba67b770af8702bb27fd394bd7ea17c4c363b599efbbd055: Status 404 returned error can't find the container with id 7b6af38c7b40ebd6ba67b770af8702bb27fd394bd7ea17c4c363b599efbbd055 Oct 03 14:21:51 crc kubenswrapper[4636]: I1003 14:21:51.867055 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03b9cd0b-5397-46c3-af28-f6e766fc596b","Type":"ContainerStarted","Data":"7b6af38c7b40ebd6ba67b770af8702bb27fd394bd7ea17c4c363b599efbbd055"} Oct 03 14:21:51 crc kubenswrapper[4636]: I1003 14:21:51.870236 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rw5t7" event={"ID":"fb2e76a1-5457-484c-b311-c46b1eecec12","Type":"ContainerStarted","Data":"c5be79b6e2407ac3e8070c2de882f889bd58565c08bea0179ce362e5016be510"} Oct 03 14:21:52 crc kubenswrapper[4636]: I1003 14:21:52.881914 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03b9cd0b-5397-46c3-af28-f6e766fc596b","Type":"ContainerStarted","Data":"5fdbce835f4bd9edba3b6583c6d89e3188bf203ad6471b9ff62a7126f26b9527"} Oct 03 14:21:53 crc kubenswrapper[4636]: I1003 14:21:53.895023 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03b9cd0b-5397-46c3-af28-f6e766fc596b","Type":"ContainerStarted","Data":"c31fea35d7abaa093f313d967f734147cfac64f7e364633847676dfae568815b"} Oct 03 14:21:54 crc kubenswrapper[4636]: I1003 14:21:54.926550 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03b9cd0b-5397-46c3-af28-f6e766fc596b","Type":"ContainerStarted","Data":"383c4d2f0de37f68dbfba57a9e6f2134a2f1e06da2900b25687eb437d9f8e71e"} Oct 03 14:21:55 crc kubenswrapper[4636]: I1003 14:21:55.938132 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03b9cd0b-5397-46c3-af28-f6e766fc596b","Type":"ContainerStarted","Data":"564f47632c5e6bbbcd8cfc439fe5662bcecb6c0f12457b69a6c9058adf9d44e5"} Oct 03 14:21:55 crc kubenswrapper[4636]: I1003 14:21:55.938421 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 14:21:55 crc kubenswrapper[4636]: I1003 
14:21:55.963486 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.013097141 podStartE2EDuration="6.963470012s" podCreationTimestamp="2025-10-03 14:21:49 +0000 UTC" firstStartedPulling="2025-10-03 14:21:51.642547547 +0000 UTC m=+1261.501273794" lastFinishedPulling="2025-10-03 14:21:55.592920418 +0000 UTC m=+1265.451646665" observedRunningTime="2025-10-03 14:21:55.961820495 +0000 UTC m=+1265.820546752" watchObservedRunningTime="2025-10-03 14:21:55.963470012 +0000 UTC m=+1265.822196269"
Oct 03 14:21:55 crc kubenswrapper[4636]: I1003 14:21:55.974729 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-rw5t7" podStartSLOduration=5.962210345 podStartE2EDuration="15.974707069s" podCreationTimestamp="2025-10-03 14:21:40 +0000 UTC" firstStartedPulling="2025-10-03 14:21:41.129993473 +0000 UTC m=+1250.988719720" lastFinishedPulling="2025-10-03 14:21:51.142490197 +0000 UTC m=+1261.001216444" observedRunningTime="2025-10-03 14:21:51.888179407 +0000 UTC m=+1261.746905654" watchObservedRunningTime="2025-10-03 14:21:55.974707069 +0000 UTC m=+1265.833433306"
Oct 03 14:21:57 crc kubenswrapper[4636]: I1003 14:21:57.692896 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7976d47688-kx5v5" podUID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused"
Oct 03 14:21:57 crc kubenswrapper[4636]: I1003 14:21:57.693380 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7976d47688-kx5v5"
Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.006669 4636 generic.go:334] "Generic (PLEG): container finished" podID="fb2e76a1-5457-484c-b311-c46b1eecec12" containerID="c5be79b6e2407ac3e8070c2de882f889bd58565c08bea0179ce362e5016be510" exitCode=0
Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.008267 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rw5t7" event={"ID":"fb2e76a1-5457-484c-b311-c46b1eecec12","Type":"ContainerDied","Data":"c5be79b6e2407ac3e8070c2de882f889bd58565c08bea0179ce362e5016be510"}
Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.014703 4636 generic.go:334] "Generic (PLEG): container finished" podID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerID="f07e34e6b6d7315b30759822fbb4041e8a844861251cba7541a6632092e00e7f" exitCode=137
Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.014766 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7976d47688-kx5v5" event={"ID":"92ef2fa8-5e4e-49f1-8840-01b5be29d036","Type":"ContainerDied","Data":"f07e34e6b6d7315b30759822fbb4041e8a844861251cba7541a6632092e00e7f"}
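The pod_startup_latency_tracker entries above encode a small piece of arithmetic: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that span minus the image-pull window (lastFinishedPulling minus firstStartedPulling), so pull time is excluded from the SLO figure. Replaying the numbers from the nova-cell0-conductor-db-sync-rw5t7 record: 15.974707069s total minus 10.012496724s of pulling leaves the logged 5.962210345s. The same check in Go, with the timestamps copied from that record:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	created := parse("2025-10-03 14:21:40 +0000 UTC")
	firstPull := parse("2025-10-03 14:21:41.129993473 +0000 UTC")
	lastPull := parse("2025-10-03 14:21:51.142490197 +0000 UTC")
	running := parse("2025-10-03 14:21:55.974707069 +0000 UTC")

	e2e := running.Sub(created)           // creation -> observed running
	slo := e2e - lastPull.Sub(firstPull)  // minus the image-pull window
	fmt.Println("podStartE2EDuration:", e2e) // 15.974707069s
	fmt.Println("podStartSLOduration:", slo) // 5.962210345s
}
```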
Need to start a new one" pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.222213 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vccvt\" (UniqueName: \"kubernetes.io/projected/92ef2fa8-5e4e-49f1-8840-01b5be29d036-kube-api-access-vccvt\") pod \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.222490 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/92ef2fa8-5e4e-49f1-8840-01b5be29d036-horizon-secret-key\") pod \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.222704 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92ef2fa8-5e4e-49f1-8840-01b5be29d036-logs\") pod \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.222859 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92ef2fa8-5e4e-49f1-8840-01b5be29d036-scripts\") pod \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.223011 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92ef2fa8-5e4e-49f1-8840-01b5be29d036-combined-ca-bundle\") pod \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.223182 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/92ef2fa8-5e4e-49f1-8840-01b5be29d036-horizon-tls-certs\") pod \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.223284 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/92ef2fa8-5e4e-49f1-8840-01b5be29d036-config-data\") pod \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\" (UID: \"92ef2fa8-5e4e-49f1-8840-01b5be29d036\") " Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.223233 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92ef2fa8-5e4e-49f1-8840-01b5be29d036-logs" (OuterVolumeSpecName: "logs") pod "92ef2fa8-5e4e-49f1-8840-01b5be29d036" (UID: "92ef2fa8-5e4e-49f1-8840-01b5be29d036"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.229864 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92ef2fa8-5e4e-49f1-8840-01b5be29d036-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "92ef2fa8-5e4e-49f1-8840-01b5be29d036" (UID: "92ef2fa8-5e4e-49f1-8840-01b5be29d036"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.236285 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92ef2fa8-5e4e-49f1-8840-01b5be29d036-kube-api-access-vccvt" (OuterVolumeSpecName: "kube-api-access-vccvt") pod "92ef2fa8-5e4e-49f1-8840-01b5be29d036" (UID: "92ef2fa8-5e4e-49f1-8840-01b5be29d036"). InnerVolumeSpecName "kube-api-access-vccvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.261808 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92ef2fa8-5e4e-49f1-8840-01b5be29d036-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92ef2fa8-5e4e-49f1-8840-01b5be29d036" (UID: "92ef2fa8-5e4e-49f1-8840-01b5be29d036"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.263669 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92ef2fa8-5e4e-49f1-8840-01b5be29d036-scripts" (OuterVolumeSpecName: "scripts") pod "92ef2fa8-5e4e-49f1-8840-01b5be29d036" (UID: "92ef2fa8-5e4e-49f1-8840-01b5be29d036"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.265592 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92ef2fa8-5e4e-49f1-8840-01b5be29d036-config-data" (OuterVolumeSpecName: "config-data") pod "92ef2fa8-5e4e-49f1-8840-01b5be29d036" (UID: "92ef2fa8-5e4e-49f1-8840-01b5be29d036"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.292031 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92ef2fa8-5e4e-49f1-8840-01b5be29d036-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "92ef2fa8-5e4e-49f1-8840-01b5be29d036" (UID: "92ef2fa8-5e4e-49f1-8840-01b5be29d036"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.326221 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vccvt\" (UniqueName: \"kubernetes.io/projected/92ef2fa8-5e4e-49f1-8840-01b5be29d036-kube-api-access-vccvt\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.326524 4636 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/92ef2fa8-5e4e-49f1-8840-01b5be29d036-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.326669 4636 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92ef2fa8-5e4e-49f1-8840-01b5be29d036-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.326789 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92ef2fa8-5e4e-49f1-8840-01b5be29d036-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.326947 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92ef2fa8-5e4e-49f1-8840-01b5be29d036-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.327123 4636 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/92ef2fa8-5e4e-49f1-8840-01b5be29d036-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:03 crc kubenswrapper[4636]: I1003 14:22:03.327246 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/92ef2fa8-5e4e-49f1-8840-01b5be29d036-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:04 crc kubenswrapper[4636]: I1003 14:22:04.032729 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7976d47688-kx5v5" Oct 03 14:22:04 crc kubenswrapper[4636]: I1003 14:22:04.034231 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7976d47688-kx5v5" event={"ID":"92ef2fa8-5e4e-49f1-8840-01b5be29d036","Type":"ContainerDied","Data":"9f3ab565914dc40550b11d09227c0d9ef5e347c0c2ce97b84ce364dc428f4293"} Oct 03 14:22:04 crc kubenswrapper[4636]: I1003 14:22:04.034308 4636 scope.go:117] "RemoveContainer" containerID="04ea3c8403ac5e4db0992d3e740398054c3ed5e4c5fb3d90d47fa1626284e1c7" Oct 03 14:22:04 crc kubenswrapper[4636]: I1003 14:22:04.077406 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7976d47688-kx5v5"] Oct 03 14:22:04 crc kubenswrapper[4636]: I1003 14:22:04.085421 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7976d47688-kx5v5"] Oct 03 14:22:04 crc kubenswrapper[4636]: I1003 14:22:04.232255 4636 scope.go:117] "RemoveContainer" containerID="f07e34e6b6d7315b30759822fbb4041e8a844861251cba7541a6632092e00e7f" Oct 03 14:22:04 crc kubenswrapper[4636]: I1003 14:22:04.417887 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rw5t7" Oct 03 14:22:04 crc kubenswrapper[4636]: I1003 14:22:04.547534 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb2e76a1-5457-484c-b311-c46b1eecec12-combined-ca-bundle\") pod \"fb2e76a1-5457-484c-b311-c46b1eecec12\" (UID: \"fb2e76a1-5457-484c-b311-c46b1eecec12\") " Oct 03 14:22:04 crc kubenswrapper[4636]: I1003 14:22:04.547626 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb2e76a1-5457-484c-b311-c46b1eecec12-scripts\") pod \"fb2e76a1-5457-484c-b311-c46b1eecec12\" (UID: \"fb2e76a1-5457-484c-b311-c46b1eecec12\") " Oct 03 14:22:04 crc kubenswrapper[4636]: I1003 14:22:04.547768 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb2e76a1-5457-484c-b311-c46b1eecec12-config-data\") pod \"fb2e76a1-5457-484c-b311-c46b1eecec12\" (UID: \"fb2e76a1-5457-484c-b311-c46b1eecec12\") " Oct 03 14:22:04 crc kubenswrapper[4636]: I1003 14:22:04.547836 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btxl2\" (UniqueName: \"kubernetes.io/projected/fb2e76a1-5457-484c-b311-c46b1eecec12-kube-api-access-btxl2\") pod \"fb2e76a1-5457-484c-b311-c46b1eecec12\" (UID: \"fb2e76a1-5457-484c-b311-c46b1eecec12\") " Oct 03 14:22:04 crc kubenswrapper[4636]: I1003 14:22:04.553242 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb2e76a1-5457-484c-b311-c46b1eecec12-scripts" (OuterVolumeSpecName: "scripts") pod "fb2e76a1-5457-484c-b311-c46b1eecec12" (UID: "fb2e76a1-5457-484c-b311-c46b1eecec12"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:22:04 crc kubenswrapper[4636]: I1003 14:22:04.553341 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb2e76a1-5457-484c-b311-c46b1eecec12-kube-api-access-btxl2" (OuterVolumeSpecName: "kube-api-access-btxl2") pod "fb2e76a1-5457-484c-b311-c46b1eecec12" (UID: "fb2e76a1-5457-484c-b311-c46b1eecec12"). InnerVolumeSpecName "kube-api-access-btxl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:22:04 crc kubenswrapper[4636]: I1003 14:22:04.577741 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb2e76a1-5457-484c-b311-c46b1eecec12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb2e76a1-5457-484c-b311-c46b1eecec12" (UID: "fb2e76a1-5457-484c-b311-c46b1eecec12"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:22:04 crc kubenswrapper[4636]: I1003 14:22:04.577961 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb2e76a1-5457-484c-b311-c46b1eecec12-config-data" (OuterVolumeSpecName: "config-data") pod "fb2e76a1-5457-484c-b311-c46b1eecec12" (UID: "fb2e76a1-5457-484c-b311-c46b1eecec12"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:22:04 crc kubenswrapper[4636]: I1003 14:22:04.651428 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb2e76a1-5457-484c-b311-c46b1eecec12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:04 crc kubenswrapper[4636]: I1003 14:22:04.651485 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb2e76a1-5457-484c-b311-c46b1eecec12-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:04 crc kubenswrapper[4636]: I1003 14:22:04.651534 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb2e76a1-5457-484c-b311-c46b1eecec12-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:04 crc kubenswrapper[4636]: I1003 14:22:04.651548 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btxl2\" (UniqueName: \"kubernetes.io/projected/fb2e76a1-5457-484c-b311-c46b1eecec12-kube-api-access-btxl2\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:04 crc kubenswrapper[4636]: I1003 14:22:04.804876 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" path="/var/lib/kubelet/pods/92ef2fa8-5e4e-49f1-8840-01b5be29d036/volumes" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.044576 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rw5t7" event={"ID":"fb2e76a1-5457-484c-b311-c46b1eecec12","Type":"ContainerDied","Data":"3482dc5e368b5c92a867c048d17ff6d312c3cd1cde623f4eb43a8209b34f08a0"} Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.045184 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3482dc5e368b5c92a867c048d17ff6d312c3cd1cde623f4eb43a8209b34f08a0" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.044610 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rw5t7" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.128790 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 14:22:05 crc kubenswrapper[4636]: E1003 14:22:05.129276 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2e76a1-5457-484c-b311-c46b1eecec12" containerName="nova-cell0-conductor-db-sync" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.129299 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2e76a1-5457-484c-b311-c46b1eecec12" containerName="nova-cell0-conductor-db-sync" Oct 03 14:22:05 crc kubenswrapper[4636]: E1003 14:22:05.129312 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerName="horizon-log" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.129320 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerName="horizon-log" Oct 03 14:22:05 crc kubenswrapper[4636]: E1003 14:22:05.129341 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerName="horizon" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.129349 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerName="horizon" Oct 03 14:22:05 crc kubenswrapper[4636]: E1003 14:22:05.129367 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerName="horizon" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.129374 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerName="horizon" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.129604 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerName="horizon-log" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.129622 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerName="horizon" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.129631 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="92ef2fa8-5e4e-49f1-8840-01b5be29d036" containerName="horizon" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.129649 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb2e76a1-5457-484c-b311-c46b1eecec12" containerName="nova-cell0-conductor-db-sync" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.130499 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.132565 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.136235 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-9xcsh" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.143971 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.261480 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efbece4d-3b40-41b8-819a-9dac3cf42b21-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"efbece4d-3b40-41b8-819a-9dac3cf42b21\") " pod="openstack/nova-cell0-conductor-0" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.261646 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efbece4d-3b40-41b8-819a-9dac3cf42b21-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"efbece4d-3b40-41b8-819a-9dac3cf42b21\") " pod="openstack/nova-cell0-conductor-0" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.261681 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s9bf\" (UniqueName: \"kubernetes.io/projected/efbece4d-3b40-41b8-819a-9dac3cf42b21-kube-api-access-9s9bf\") pod \"nova-cell0-conductor-0\" (UID: \"efbece4d-3b40-41b8-819a-9dac3cf42b21\") " pod="openstack/nova-cell0-conductor-0" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.363616 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efbece4d-3b40-41b8-819a-9dac3cf42b21-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"efbece4d-3b40-41b8-819a-9dac3cf42b21\") " pod="openstack/nova-cell0-conductor-0" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.364316 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s9bf\" (UniqueName: \"kubernetes.io/projected/efbece4d-3b40-41b8-819a-9dac3cf42b21-kube-api-access-9s9bf\") pod \"nova-cell0-conductor-0\" (UID: \"efbece4d-3b40-41b8-819a-9dac3cf42b21\") " pod="openstack/nova-cell0-conductor-0" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.364411 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efbece4d-3b40-41b8-819a-9dac3cf42b21-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"efbece4d-3b40-41b8-819a-9dac3cf42b21\") " pod="openstack/nova-cell0-conductor-0" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.367803 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efbece4d-3b40-41b8-819a-9dac3cf42b21-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"efbece4d-3b40-41b8-819a-9dac3cf42b21\") " pod="openstack/nova-cell0-conductor-0" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.378861 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efbece4d-3b40-41b8-819a-9dac3cf42b21-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"efbece4d-3b40-41b8-819a-9dac3cf42b21\") " pod="openstack/nova-cell0-conductor-0" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.381359 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s9bf\" (UniqueName: \"kubernetes.io/projected/efbece4d-3b40-41b8-819a-9dac3cf42b21-kube-api-access-9s9bf\") pod \"nova-cell0-conductor-0\" (UID: \"efbece4d-3b40-41b8-819a-9dac3cf42b21\") " pod="openstack/nova-cell0-conductor-0" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.451711 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 14:22:05 crc kubenswrapper[4636]: I1003 14:22:05.868251 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 14:22:06 crc kubenswrapper[4636]: I1003 14:22:06.056604 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"efbece4d-3b40-41b8-819a-9dac3cf42b21","Type":"ContainerStarted","Data":"2c4de159e2fbe419570c78e2a70eef49f7a42533bb7998469b2d8babcc6e31fb"} Oct 03 14:22:07 crc kubenswrapper[4636]: I1003 14:22:07.069131 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"efbece4d-3b40-41b8-819a-9dac3cf42b21","Type":"ContainerStarted","Data":"7f7f7dd61ac692c35ec33b64bd51fd159bb4cf4b56b420e274b594eb73b67f05"} Oct 03 14:22:07 crc kubenswrapper[4636]: I1003 14:22:07.069778 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 03 14:22:07 crc kubenswrapper[4636]: I1003 14:22:07.096523 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.096508273 podStartE2EDuration="2.096508273s" podCreationTimestamp="2025-10-03 14:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:22:07.09357467 +0000 UTC m=+1276.952300917" watchObservedRunningTime="2025-10-03 14:22:07.096508273 +0000 UTC m=+1276.955234520" Oct 03 14:22:15 crc kubenswrapper[4636]: I1003 14:22:15.488842 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.036353 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-9l45j"] Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.037822 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9l45j" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.051901 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9l45j"] Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.060693 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.062707 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.172200 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkrxh\" (UniqueName: \"kubernetes.io/projected/7f5bfa77-9c30-4f65-900b-62595074b467-kube-api-access-nkrxh\") pod \"nova-cell0-cell-mapping-9l45j\" (UID: \"7f5bfa77-9c30-4f65-900b-62595074b467\") " pod="openstack/nova-cell0-cell-mapping-9l45j" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.172631 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f5bfa77-9c30-4f65-900b-62595074b467-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9l45j\" (UID: \"7f5bfa77-9c30-4f65-900b-62595074b467\") " pod="openstack/nova-cell0-cell-mapping-9l45j" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.172736 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f5bfa77-9c30-4f65-900b-62595074b467-scripts\") pod \"nova-cell0-cell-mapping-9l45j\" (UID: \"7f5bfa77-9c30-4f65-900b-62595074b467\") " pod="openstack/nova-cell0-cell-mapping-9l45j" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.172787 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f5bfa77-9c30-4f65-900b-62595074b467-config-data\") pod \"nova-cell0-cell-mapping-9l45j\" (UID: \"7f5bfa77-9c30-4f65-900b-62595074b467\") " pod="openstack/nova-cell0-cell-mapping-9l45j" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.244792 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.246236 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.255582 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.275167 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f5bfa77-9c30-4f65-900b-62595074b467-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9l45j\" (UID: \"7f5bfa77-9c30-4f65-900b-62595074b467\") " pod="openstack/nova-cell0-cell-mapping-9l45j" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.275258 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f5bfa77-9c30-4f65-900b-62595074b467-scripts\") pod \"nova-cell0-cell-mapping-9l45j\" (UID: \"7f5bfa77-9c30-4f65-900b-62595074b467\") " pod="openstack/nova-cell0-cell-mapping-9l45j" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.275318 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f5bfa77-9c30-4f65-900b-62595074b467-config-data\") pod \"nova-cell0-cell-mapping-9l45j\" (UID: \"7f5bfa77-9c30-4f65-900b-62595074b467\") " pod="openstack/nova-cell0-cell-mapping-9l45j" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.275399 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkrxh\" (UniqueName: \"kubernetes.io/projected/7f5bfa77-9c30-4f65-900b-62595074b467-kube-api-access-nkrxh\") pod \"nova-cell0-cell-mapping-9l45j\" (UID: \"7f5bfa77-9c30-4f65-900b-62595074b467\") " pod="openstack/nova-cell0-cell-mapping-9l45j" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.286021 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f5bfa77-9c30-4f65-900b-62595074b467-scripts\") pod \"nova-cell0-cell-mapping-9l45j\" (UID: \"7f5bfa77-9c30-4f65-900b-62595074b467\") " pod="openstack/nova-cell0-cell-mapping-9l45j" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.295914 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f5bfa77-9c30-4f65-900b-62595074b467-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9l45j\" (UID: \"7f5bfa77-9c30-4f65-900b-62595074b467\") " pod="openstack/nova-cell0-cell-mapping-9l45j" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.303464 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.303942 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f5bfa77-9c30-4f65-900b-62595074b467-config-data\") pod \"nova-cell0-cell-mapping-9l45j\" (UID: \"7f5bfa77-9c30-4f65-900b-62595074b467\") " pod="openstack/nova-cell0-cell-mapping-9l45j" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.339860 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkrxh\" (UniqueName: \"kubernetes.io/projected/7f5bfa77-9c30-4f65-900b-62595074b467-kube-api-access-nkrxh\") pod \"nova-cell0-cell-mapping-9l45j\" (UID: \"7f5bfa77-9c30-4f65-900b-62595074b467\") " pod="openstack/nova-cell0-cell-mapping-9l45j" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.378029 4636 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9l45j" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.379053 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0517677b-2391-46df-a1a8-e73266c4e056-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0517677b-2391-46df-a1a8-e73266c4e056\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.379256 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0517677b-2391-46df-a1a8-e73266c4e056-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0517677b-2391-46df-a1a8-e73266c4e056\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.379323 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtgbh\" (UniqueName: \"kubernetes.io/projected/0517677b-2391-46df-a1a8-e73266c4e056-kube-api-access-rtgbh\") pod \"nova-cell1-novncproxy-0\" (UID: \"0517677b-2391-46df-a1a8-e73266c4e056\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.440287 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.442360 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.454226 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.455325 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.491018 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0517677b-2391-46df-a1a8-e73266c4e056-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0517677b-2391-46df-a1a8-e73266c4e056\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.491134 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtgbh\" (UniqueName: \"kubernetes.io/projected/0517677b-2391-46df-a1a8-e73266c4e056-kube-api-access-rtgbh\") pod \"nova-cell1-novncproxy-0\" (UID: \"0517677b-2391-46df-a1a8-e73266c4e056\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.491181 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0517677b-2391-46df-a1a8-e73266c4e056-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0517677b-2391-46df-a1a8-e73266c4e056\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.497791 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.500606 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0517677b-2391-46df-a1a8-e73266c4e056-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"0517677b-2391-46df-a1a8-e73266c4e056\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.524272 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0517677b-2391-46df-a1a8-e73266c4e056-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0517677b-2391-46df-a1a8-e73266c4e056\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.535538 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.558959 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.594690 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.601727 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53cf2edd-da8b-4739-8e6f-24948c5d84c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"53cf2edd-da8b-4739-8e6f-24948c5d84c4\") " pod="openstack/nova-metadata-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.601807 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ld68\" (UniqueName: \"kubernetes.io/projected/e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c-kube-api-access-7ld68\") pod \"nova-scheduler-0\" (UID: \"e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c\") " pod="openstack/nova-scheduler-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.601946 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c-config-data\") pod \"nova-scheduler-0\" (UID: \"e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c\") " pod="openstack/nova-scheduler-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.602074 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47nlg\" (UniqueName: \"kubernetes.io/projected/53cf2edd-da8b-4739-8e6f-24948c5d84c4-kube-api-access-47nlg\") pod \"nova-metadata-0\" (UID: \"53cf2edd-da8b-4739-8e6f-24948c5d84c4\") " pod="openstack/nova-metadata-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.605440 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53cf2edd-da8b-4739-8e6f-24948c5d84c4-config-data\") pod \"nova-metadata-0\" (UID: \"53cf2edd-da8b-4739-8e6f-24948c5d84c4\") " pod="openstack/nova-metadata-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.605501 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53cf2edd-da8b-4739-8e6f-24948c5d84c4-logs\") pod \"nova-metadata-0\" (UID: \"53cf2edd-da8b-4739-8e6f-24948c5d84c4\") " pod="openstack/nova-metadata-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.605601 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c-combined-ca-bundle\") pod \"nova-scheduler-0\" 
(UID: \"e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c\") " pod="openstack/nova-scheduler-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.711295 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47nlg\" (UniqueName: \"kubernetes.io/projected/53cf2edd-da8b-4739-8e6f-24948c5d84c4-kube-api-access-47nlg\") pod \"nova-metadata-0\" (UID: \"53cf2edd-da8b-4739-8e6f-24948c5d84c4\") " pod="openstack/nova-metadata-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.711407 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53cf2edd-da8b-4739-8e6f-24948c5d84c4-config-data\") pod \"nova-metadata-0\" (UID: \"53cf2edd-da8b-4739-8e6f-24948c5d84c4\") " pod="openstack/nova-metadata-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.711437 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53cf2edd-da8b-4739-8e6f-24948c5d84c4-logs\") pod \"nova-metadata-0\" (UID: \"53cf2edd-da8b-4739-8e6f-24948c5d84c4\") " pod="openstack/nova-metadata-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.711536 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c\") " pod="openstack/nova-scheduler-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.711663 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53cf2edd-da8b-4739-8e6f-24948c5d84c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"53cf2edd-da8b-4739-8e6f-24948c5d84c4\") " pod="openstack/nova-metadata-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.711706 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ld68\" (UniqueName: \"kubernetes.io/projected/e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c-kube-api-access-7ld68\") pod \"nova-scheduler-0\" (UID: \"e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c\") " pod="openstack/nova-scheduler-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.711840 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c-config-data\") pod \"nova-scheduler-0\" (UID: \"e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c\") " pod="openstack/nova-scheduler-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.720152 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53cf2edd-da8b-4739-8e6f-24948c5d84c4-logs\") pod \"nova-metadata-0\" (UID: \"53cf2edd-da8b-4739-8e6f-24948c5d84c4\") " pod="openstack/nova-metadata-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.727069 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtgbh\" (UniqueName: \"kubernetes.io/projected/0517677b-2391-46df-a1a8-e73266c4e056-kube-api-access-rtgbh\") pod \"nova-cell1-novncproxy-0\" (UID: \"0517677b-2391-46df-a1a8-e73266c4e056\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.770275 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c-config-data\") pod \"nova-scheduler-0\" (UID: \"e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c\") " pod="openstack/nova-scheduler-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.771675 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c\") " pod="openstack/nova-scheduler-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.779655 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53cf2edd-da8b-4739-8e6f-24948c5d84c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"53cf2edd-da8b-4739-8e6f-24948c5d84c4\") " pod="openstack/nova-metadata-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.786226 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53cf2edd-da8b-4739-8e6f-24948c5d84c4-config-data\") pod \"nova-metadata-0\" (UID: \"53cf2edd-da8b-4739-8e6f-24948c5d84c4\") " pod="openstack/nova-metadata-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.843493 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ld68\" (UniqueName: \"kubernetes.io/projected/e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c-kube-api-access-7ld68\") pod \"nova-scheduler-0\" (UID: \"e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c\") " pod="openstack/nova-scheduler-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.863845 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.865931 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47nlg\" (UniqueName: \"kubernetes.io/projected/53cf2edd-da8b-4739-8e6f-24948c5d84c4-kube-api-access-47nlg\") pod \"nova-metadata-0\" (UID: \"53cf2edd-da8b-4739-8e6f-24948c5d84c4\") " pod="openstack/nova-metadata-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.887046 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.928046 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-xcbk7"] Oct 03 14:22:16 crc kubenswrapper[4636]: I1003 14:22:16.935312 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.004456 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-xcbk7"] Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.054500 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.056347 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.025156 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-xcbk7\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.057966 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-872nj\" (UniqueName: \"kubernetes.io/projected/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-kube-api-access-872nj\") pod \"dnsmasq-dns-757b4f8459-xcbk7\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.058085 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-xcbk7\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.066344 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-dns-svc\") pod \"dnsmasq-dns-757b4f8459-xcbk7\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.066407 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-config\") pod \"dnsmasq-dns-757b4f8459-xcbk7\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.066612 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.064748 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.066627 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-xcbk7\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.109851 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.174717 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5ccdfa8-6bab-4376-8abf-8def038929c5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b5ccdfa8-6bab-4376-8abf-8def038929c5\") " pod="openstack/nova-api-0" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.174788 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjkcl\" (UniqueName: \"kubernetes.io/projected/b5ccdfa8-6bab-4376-8abf-8def038929c5-kube-api-access-pjkcl\") pod \"nova-api-0\" (UID: \"b5ccdfa8-6bab-4376-8abf-8def038929c5\") " pod="openstack/nova-api-0" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.174860 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-xcbk7\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.174904 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5ccdfa8-6bab-4376-8abf-8def038929c5-logs\") pod \"nova-api-0\" (UID: \"b5ccdfa8-6bab-4376-8abf-8def038929c5\") " pod="openstack/nova-api-0" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.174930 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-xcbk7\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.174978 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-872nj\" (UniqueName: \"kubernetes.io/projected/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-kube-api-access-872nj\") pod \"dnsmasq-dns-757b4f8459-xcbk7\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.175020 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-xcbk7\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.175082 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-dns-svc\") pod \"dnsmasq-dns-757b4f8459-xcbk7\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.175120 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5ccdfa8-6bab-4376-8abf-8def038929c5-config-data\") pod \"nova-api-0\" (UID: \"b5ccdfa8-6bab-4376-8abf-8def038929c5\") " pod="openstack/nova-api-0" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.175148 
4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-config\") pod \"dnsmasq-dns-757b4f8459-xcbk7\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.176152 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-config\") pod \"dnsmasq-dns-757b4f8459-xcbk7\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.176795 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-dns-svc\") pod \"dnsmasq-dns-757b4f8459-xcbk7\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.176885 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-xcbk7\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.177446 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-xcbk7\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.188788 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-xcbk7\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.211828 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-872nj\" (UniqueName: \"kubernetes.io/projected/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-kube-api-access-872nj\") pod \"dnsmasq-dns-757b4f8459-xcbk7\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.279009 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjkcl\" (UniqueName: \"kubernetes.io/projected/b5ccdfa8-6bab-4376-8abf-8def038929c5-kube-api-access-pjkcl\") pod \"nova-api-0\" (UID: \"b5ccdfa8-6bab-4376-8abf-8def038929c5\") " pod="openstack/nova-api-0" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.279477 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5ccdfa8-6bab-4376-8abf-8def038929c5-logs\") pod \"nova-api-0\" (UID: \"b5ccdfa8-6bab-4376-8abf-8def038929c5\") " pod="openstack/nova-api-0" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.279571 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5ccdfa8-6bab-4376-8abf-8def038929c5-config-data\") pod 
\"nova-api-0\" (UID: \"b5ccdfa8-6bab-4376-8abf-8def038929c5\") " pod="openstack/nova-api-0" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.279611 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5ccdfa8-6bab-4376-8abf-8def038929c5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b5ccdfa8-6bab-4376-8abf-8def038929c5\") " pod="openstack/nova-api-0" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.280819 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5ccdfa8-6bab-4376-8abf-8def038929c5-logs\") pod \"nova-api-0\" (UID: \"b5ccdfa8-6bab-4376-8abf-8def038929c5\") " pod="openstack/nova-api-0" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.282290 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5ccdfa8-6bab-4376-8abf-8def038929c5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b5ccdfa8-6bab-4376-8abf-8def038929c5\") " pod="openstack/nova-api-0" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.307623 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5ccdfa8-6bab-4376-8abf-8def038929c5-config-data\") pod \"nova-api-0\" (UID: \"b5ccdfa8-6bab-4376-8abf-8def038929c5\") " pod="openstack/nova-api-0" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.308144 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.318995 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjkcl\" (UniqueName: \"kubernetes.io/projected/b5ccdfa8-6bab-4376-8abf-8def038929c5-kube-api-access-pjkcl\") pod \"nova-api-0\" (UID: \"b5ccdfa8-6bab-4376-8abf-8def038929c5\") " pod="openstack/nova-api-0" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.419027 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.430546 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9l45j"] Oct 03 14:22:17 crc kubenswrapper[4636]: W1003 14:22:17.500088 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f5bfa77_9c30_4f65_900b_62595074b467.slice/crio-1f4cd6149225356f4f00a12e247d72f920399745272be83c4721cb2264a7e729 WatchSource:0}: Error finding container 1f4cd6149225356f4f00a12e247d72f920399745272be83c4721cb2264a7e729: Status 404 returned error can't find the container with id 1f4cd6149225356f4f00a12e247d72f920399745272be83c4721cb2264a7e729 Oct 03 14:22:17 crc kubenswrapper[4636]: I1003 14:22:17.877459 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 14:22:18 crc kubenswrapper[4636]: W1003 14:22:18.075960 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode025b6c1_c1da_4cc7_b7e1_9c271aa45f4c.slice/crio-66fe638930f0b7955a7dbdcfe4b10374da2108fe7fd8f7b94376656a9efbaabb WatchSource:0}: Error finding container 66fe638930f0b7955a7dbdcfe4b10374da2108fe7fd8f7b94376656a9efbaabb: Status 404 returned error can't find the container with id 66fe638930f0b7955a7dbdcfe4b10374da2108fe7fd8f7b94376656a9efbaabb Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.079672 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.086450 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.155702 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-xcbk7"] Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.194663 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c","Type":"ContainerStarted","Data":"66fe638930f0b7955a7dbdcfe4b10374da2108fe7fd8f7b94376656a9efbaabb"} Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.198656 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" event={"ID":"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9","Type":"ContainerStarted","Data":"4b12f278d2381a1bef3dd17586469d8ab1a99e3f6e172e627706901d8e21a8cb"} Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.202514 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9l45j" event={"ID":"7f5bfa77-9c30-4f65-900b-62595074b467","Type":"ContainerStarted","Data":"cf31480d5894ef92aab08f7c038b7f96f19f5cdbe5835559a8bf45bff7ecda9d"} Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.202559 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9l45j" event={"ID":"7f5bfa77-9c30-4f65-900b-62595074b467","Type":"ContainerStarted","Data":"1f4cd6149225356f4f00a12e247d72f920399745272be83c4721cb2264a7e729"} Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.206158 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53cf2edd-da8b-4739-8e6f-24948c5d84c4","Type":"ContainerStarted","Data":"dfb81f1f365734fb24cb6430f4da71670f6029b157fff8475b9f23fa66a96299"} Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.208204 4636 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0517677b-2391-46df-a1a8-e73266c4e056","Type":"ContainerStarted","Data":"3263648b1a69c794a9dd9c805e04d85e9006be8a3fcefe84f1bea57c3ff367ee"} Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.224580 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-9l45j" podStartSLOduration=2.224411042 podStartE2EDuration="2.224411042s" podCreationTimestamp="2025-10-03 14:22:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:22:18.221766023 +0000 UTC m=+1288.080492280" watchObservedRunningTime="2025-10-03 14:22:18.224411042 +0000 UTC m=+1288.083137349" Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.267409 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.321444 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9b2q2"] Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.323253 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9b2q2" Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.329205 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.329413 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.329505 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9b2q2"] Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.411318 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9b2q2\" (UID: \"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5\") " pod="openstack/nova-cell1-conductor-db-sync-9b2q2" Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.411650 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-config-data\") pod \"nova-cell1-conductor-db-sync-9b2q2\" (UID: \"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5\") " pod="openstack/nova-cell1-conductor-db-sync-9b2q2" Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.411733 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gtj2\" (UniqueName: \"kubernetes.io/projected/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-kube-api-access-9gtj2\") pod \"nova-cell1-conductor-db-sync-9b2q2\" (UID: \"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5\") " pod="openstack/nova-cell1-conductor-db-sync-9b2q2" Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.411754 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-scripts\") pod \"nova-cell1-conductor-db-sync-9b2q2\" (UID: \"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5\") " pod="openstack/nova-cell1-conductor-db-sync-9b2q2" Oct 03 14:22:18 crc kubenswrapper[4636]: 
I1003 14:22:18.513389 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-config-data\") pod \"nova-cell1-conductor-db-sync-9b2q2\" (UID: \"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5\") " pod="openstack/nova-cell1-conductor-db-sync-9b2q2" Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.513457 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gtj2\" (UniqueName: \"kubernetes.io/projected/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-kube-api-access-9gtj2\") pod \"nova-cell1-conductor-db-sync-9b2q2\" (UID: \"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5\") " pod="openstack/nova-cell1-conductor-db-sync-9b2q2" Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.513477 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-scripts\") pod \"nova-cell1-conductor-db-sync-9b2q2\" (UID: \"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5\") " pod="openstack/nova-cell1-conductor-db-sync-9b2q2" Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.513578 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9b2q2\" (UID: \"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5\") " pod="openstack/nova-cell1-conductor-db-sync-9b2q2" Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.522715 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-config-data\") pod \"nova-cell1-conductor-db-sync-9b2q2\" (UID: \"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5\") " pod="openstack/nova-cell1-conductor-db-sync-9b2q2" Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.523072 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9b2q2\" (UID: \"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5\") " pod="openstack/nova-cell1-conductor-db-sync-9b2q2" Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.523447 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-scripts\") pod \"nova-cell1-conductor-db-sync-9b2q2\" (UID: \"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5\") " pod="openstack/nova-cell1-conductor-db-sync-9b2q2" Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.534213 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gtj2\" (UniqueName: \"kubernetes.io/projected/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-kube-api-access-9gtj2\") pod \"nova-cell1-conductor-db-sync-9b2q2\" (UID: \"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5\") " pod="openstack/nova-cell1-conductor-db-sync-9b2q2" Oct 03 14:22:18 crc kubenswrapper[4636]: I1003 14:22:18.767310 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9b2q2" Oct 03 14:22:19 crc kubenswrapper[4636]: I1003 14:22:19.223538 4636 generic.go:334] "Generic (PLEG): container finished" podID="83ea03ed-bc48-4baf-a2a0-89af7e25e5b9" containerID="2edcfcda25643af1c3527e1726f933daf77199b509187f8cfd0f5ea5b52ff180" exitCode=0 Oct 03 14:22:19 crc kubenswrapper[4636]: I1003 14:22:19.223702 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" event={"ID":"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9","Type":"ContainerDied","Data":"2edcfcda25643af1c3527e1726f933daf77199b509187f8cfd0f5ea5b52ff180"} Oct 03 14:22:19 crc kubenswrapper[4636]: I1003 14:22:19.229665 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5ccdfa8-6bab-4376-8abf-8def038929c5","Type":"ContainerStarted","Data":"41b4ff012e97a3dc06e7b5adeff0058d7cf519116e1b9c66604c59a2168fbc75"} Oct 03 14:22:19 crc kubenswrapper[4636]: I1003 14:22:19.361576 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9b2q2"] Oct 03 14:22:20 crc kubenswrapper[4636]: I1003 14:22:20.258625 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9b2q2" event={"ID":"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5","Type":"ContainerStarted","Data":"da82b036efdbc7e73e12a5ebefa44760b78b445f536f231f90c1a47b60593634"} Oct 03 14:22:20 crc kubenswrapper[4636]: I1003 14:22:20.259820 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9b2q2" event={"ID":"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5","Type":"ContainerStarted","Data":"015679a27ba3199cca1f6f7b0c6d7363dec01238fc4847f08a1a2b033c044c80"} Oct 03 14:22:20 crc kubenswrapper[4636]: I1003 14:22:20.264232 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" event={"ID":"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9","Type":"ContainerStarted","Data":"185b5522a1d25463a32210cfa6011ad870d1f668cd64096f563d9f3a75c7dd92"} Oct 03 14:22:20 crc kubenswrapper[4636]: I1003 14:22:20.264450 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:22:20 crc kubenswrapper[4636]: I1003 14:22:20.278495 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-9b2q2" podStartSLOduration=2.278474527 podStartE2EDuration="2.278474527s" podCreationTimestamp="2025-10-03 14:22:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:22:20.274016362 +0000 UTC m=+1290.132742619" watchObservedRunningTime="2025-10-03 14:22:20.278474527 +0000 UTC m=+1290.137200774" Oct 03 14:22:20 crc kubenswrapper[4636]: I1003 14:22:20.312647 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 03 14:22:20 crc kubenswrapper[4636]: I1003 14:22:20.348894 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" podStartSLOduration=4.348868567 podStartE2EDuration="4.348868567s" podCreationTimestamp="2025-10-03 14:22:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:22:20.291379501 +0000 UTC m=+1290.150105748" watchObservedRunningTime="2025-10-03 14:22:20.348868567 +0000 UTC 
m=+1290.207594814" Oct 03 14:22:20 crc kubenswrapper[4636]: I1003 14:22:20.575987 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:22:20 crc kubenswrapper[4636]: I1003 14:22:20.608508 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 14:22:25 crc kubenswrapper[4636]: I1003 14:22:25.337407 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53cf2edd-da8b-4739-8e6f-24948c5d84c4","Type":"ContainerStarted","Data":"264407c3c4b870c8e7945453958885e6fffb86058ef043f5da764387f6adad8b"} Oct 03 14:22:25 crc kubenswrapper[4636]: I1003 14:22:25.337939 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53cf2edd-da8b-4739-8e6f-24948c5d84c4","Type":"ContainerStarted","Data":"a898be59dbd89538af4ee14f6dccbde69e524f52fc8ca85d540c95602c1e9e7c"} Oct 03 14:22:25 crc kubenswrapper[4636]: I1003 14:22:25.337895 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="53cf2edd-da8b-4739-8e6f-24948c5d84c4" containerName="nova-metadata-log" containerID="cri-o://a898be59dbd89538af4ee14f6dccbde69e524f52fc8ca85d540c95602c1e9e7c" gracePeriod=30 Oct 03 14:22:25 crc kubenswrapper[4636]: I1003 14:22:25.337930 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="53cf2edd-da8b-4739-8e6f-24948c5d84c4" containerName="nova-metadata-metadata" containerID="cri-o://264407c3c4b870c8e7945453958885e6fffb86058ef043f5da764387f6adad8b" gracePeriod=30 Oct 03 14:22:25 crc kubenswrapper[4636]: I1003 14:22:25.341036 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="0517677b-2391-46df-a1a8-e73266c4e056" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b61d893798c224e3d9794c4666fa5becf8734d5817ebf83283aa275633d4fa8f" gracePeriod=30 Oct 03 14:22:25 crc kubenswrapper[4636]: I1003 14:22:25.341155 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0517677b-2391-46df-a1a8-e73266c4e056","Type":"ContainerStarted","Data":"b61d893798c224e3d9794c4666fa5becf8734d5817ebf83283aa275633d4fa8f"} Oct 03 14:22:25 crc kubenswrapper[4636]: I1003 14:22:25.346541 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c","Type":"ContainerStarted","Data":"56303358c54c5d57b132441046c61fcfc6116d91143dd3a53c24c8a7f24ef2b1"} Oct 03 14:22:25 crc kubenswrapper[4636]: I1003 14:22:25.350769 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5ccdfa8-6bab-4376-8abf-8def038929c5","Type":"ContainerStarted","Data":"8c4ba55c7ed87d560530a235761e873e6f6ce9c9da37eba639124250ee25a1f7"} Oct 03 14:22:25 crc kubenswrapper[4636]: I1003 14:22:25.350812 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5ccdfa8-6bab-4376-8abf-8def038929c5","Type":"ContainerStarted","Data":"96645a637af80246c386a3faf6eb0f0418bb82d9fb75e27420668887c7fbba46"} Oct 03 14:22:25 crc kubenswrapper[4636]: I1003 14:22:25.358966 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.348753049 podStartE2EDuration="9.358949637s" podCreationTimestamp="2025-10-03 14:22:16 +0000 UTC" firstStartedPulling="2025-10-03 
14:22:18.079857106 +0000 UTC m=+1287.938583353" lastFinishedPulling="2025-10-03 14:22:24.090053694 +0000 UTC m=+1293.948779941" observedRunningTime="2025-10-03 14:22:25.358603748 +0000 UTC m=+1295.217330005" watchObservedRunningTime="2025-10-03 14:22:25.358949637 +0000 UTC m=+1295.217675884" Oct 03 14:22:25 crc kubenswrapper[4636]: I1003 14:22:25.383932 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.196276258 podStartE2EDuration="9.383913952s" podCreationTimestamp="2025-10-03 14:22:16 +0000 UTC" firstStartedPulling="2025-10-03 14:22:17.901951968 +0000 UTC m=+1287.760678215" lastFinishedPulling="2025-10-03 14:22:24.089589662 +0000 UTC m=+1293.948315909" observedRunningTime="2025-10-03 14:22:25.380226237 +0000 UTC m=+1295.238952504" watchObservedRunningTime="2025-10-03 14:22:25.383913952 +0000 UTC m=+1295.242640199" Oct 03 14:22:25 crc kubenswrapper[4636]: I1003 14:22:25.424300 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.625803618 podStartE2EDuration="9.424280895s" podCreationTimestamp="2025-10-03 14:22:16 +0000 UTC" firstStartedPulling="2025-10-03 14:22:18.296213187 +0000 UTC m=+1288.154939424" lastFinishedPulling="2025-10-03 14:22:24.094690454 +0000 UTC m=+1293.953416701" observedRunningTime="2025-10-03 14:22:25.407940913 +0000 UTC m=+1295.266667190" watchObservedRunningTime="2025-10-03 14:22:25.424280895 +0000 UTC m=+1295.283007142" Oct 03 14:22:25 crc kubenswrapper[4636]: I1003 14:22:25.426959 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.417280069 podStartE2EDuration="9.426950394s" podCreationTimestamp="2025-10-03 14:22:16 +0000 UTC" firstStartedPulling="2025-10-03 14:22:18.079880956 +0000 UTC m=+1287.938607203" lastFinishedPulling="2025-10-03 14:22:24.089551291 +0000 UTC m=+1293.948277528" observedRunningTime="2025-10-03 14:22:25.424144722 +0000 UTC m=+1295.282870979" watchObservedRunningTime="2025-10-03 14:22:25.426950394 +0000 UTC m=+1295.285676641" Oct 03 14:22:25 crc kubenswrapper[4636]: I1003 14:22:25.961265 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 14:22:25 crc kubenswrapper[4636]: I1003 14:22:25.998861 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53cf2edd-da8b-4739-8e6f-24948c5d84c4-logs\") pod \"53cf2edd-da8b-4739-8e6f-24948c5d84c4\" (UID: \"53cf2edd-da8b-4739-8e6f-24948c5d84c4\") " Oct 03 14:22:25 crc kubenswrapper[4636]: I1003 14:22:25.998967 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53cf2edd-da8b-4739-8e6f-24948c5d84c4-combined-ca-bundle\") pod \"53cf2edd-da8b-4739-8e6f-24948c5d84c4\" (UID: \"53cf2edd-da8b-4739-8e6f-24948c5d84c4\") " Oct 03 14:22:25 crc kubenswrapper[4636]: I1003 14:22:25.998998 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53cf2edd-da8b-4739-8e6f-24948c5d84c4-config-data\") pod \"53cf2edd-da8b-4739-8e6f-24948c5d84c4\" (UID: \"53cf2edd-da8b-4739-8e6f-24948c5d84c4\") " Oct 03 14:22:25 crc kubenswrapper[4636]: I1003 14:22:25.999209 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47nlg\" (UniqueName: \"kubernetes.io/projected/53cf2edd-da8b-4739-8e6f-24948c5d84c4-kube-api-access-47nlg\") pod \"53cf2edd-da8b-4739-8e6f-24948c5d84c4\" (UID: \"53cf2edd-da8b-4739-8e6f-24948c5d84c4\") " Oct 03 14:22:25 crc kubenswrapper[4636]: I1003 14:22:25.999328 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53cf2edd-da8b-4739-8e6f-24948c5d84c4-logs" (OuterVolumeSpecName: "logs") pod "53cf2edd-da8b-4739-8e6f-24948c5d84c4" (UID: "53cf2edd-da8b-4739-8e6f-24948c5d84c4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:22:25 crc kubenswrapper[4636]: I1003 14:22:25.999769 4636 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53cf2edd-da8b-4739-8e6f-24948c5d84c4-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.009017 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53cf2edd-da8b-4739-8e6f-24948c5d84c4-kube-api-access-47nlg" (OuterVolumeSpecName: "kube-api-access-47nlg") pod "53cf2edd-da8b-4739-8e6f-24948c5d84c4" (UID: "53cf2edd-da8b-4739-8e6f-24948c5d84c4"). InnerVolumeSpecName "kube-api-access-47nlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.049202 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53cf2edd-da8b-4739-8e6f-24948c5d84c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53cf2edd-da8b-4739-8e6f-24948c5d84c4" (UID: "53cf2edd-da8b-4739-8e6f-24948c5d84c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.049659 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53cf2edd-da8b-4739-8e6f-24948c5d84c4-config-data" (OuterVolumeSpecName: "config-data") pod "53cf2edd-da8b-4739-8e6f-24948c5d84c4" (UID: "53cf2edd-da8b-4739-8e6f-24948c5d84c4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.101775 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47nlg\" (UniqueName: \"kubernetes.io/projected/53cf2edd-da8b-4739-8e6f-24948c5d84c4-kube-api-access-47nlg\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.101809 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53cf2edd-da8b-4739-8e6f-24948c5d84c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.101820 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53cf2edd-da8b-4739-8e6f-24948c5d84c4-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.368777 4636 generic.go:334] "Generic (PLEG): container finished" podID="53cf2edd-da8b-4739-8e6f-24948c5d84c4" containerID="264407c3c4b870c8e7945453958885e6fffb86058ef043f5da764387f6adad8b" exitCode=0 Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.368805 4636 generic.go:334] "Generic (PLEG): container finished" podID="53cf2edd-da8b-4739-8e6f-24948c5d84c4" containerID="a898be59dbd89538af4ee14f6dccbde69e524f52fc8ca85d540c95602c1e9e7c" exitCode=143 Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.369674 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.373273 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53cf2edd-da8b-4739-8e6f-24948c5d84c4","Type":"ContainerDied","Data":"264407c3c4b870c8e7945453958885e6fffb86058ef043f5da764387f6adad8b"} Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.373345 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53cf2edd-da8b-4739-8e6f-24948c5d84c4","Type":"ContainerDied","Data":"a898be59dbd89538af4ee14f6dccbde69e524f52fc8ca85d540c95602c1e9e7c"} Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.373356 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53cf2edd-da8b-4739-8e6f-24948c5d84c4","Type":"ContainerDied","Data":"dfb81f1f365734fb24cb6430f4da71670f6029b157fff8475b9f23fa66a96299"} Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.373376 4636 scope.go:117] "RemoveContainer" containerID="264407c3c4b870c8e7945453958885e6fffb86058ef043f5da764387f6adad8b" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.403525 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.412508 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.418948 4636 scope.go:117] "RemoveContainer" containerID="a898be59dbd89538af4ee14f6dccbde69e524f52fc8ca85d540c95602c1e9e7c" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.429229 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:22:26 crc kubenswrapper[4636]: E1003 14:22:26.429596 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53cf2edd-da8b-4739-8e6f-24948c5d84c4" containerName="nova-metadata-log" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.429611 4636 
state_mem.go:107] "Deleted CPUSet assignment" podUID="53cf2edd-da8b-4739-8e6f-24948c5d84c4" containerName="nova-metadata-log" Oct 03 14:22:26 crc kubenswrapper[4636]: E1003 14:22:26.429643 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53cf2edd-da8b-4739-8e6f-24948c5d84c4" containerName="nova-metadata-metadata" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.429649 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="53cf2edd-da8b-4739-8e6f-24948c5d84c4" containerName="nova-metadata-metadata" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.429845 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="53cf2edd-da8b-4739-8e6f-24948c5d84c4" containerName="nova-metadata-metadata" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.429865 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="53cf2edd-da8b-4739-8e6f-24948c5d84c4" containerName="nova-metadata-log" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.438091 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.440996 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.446060 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.447950 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.448402 4636 scope.go:117] "RemoveContainer" containerID="264407c3c4b870c8e7945453958885e6fffb86058ef043f5da764387f6adad8b" Oct 03 14:22:26 crc kubenswrapper[4636]: E1003 14:22:26.454141 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"264407c3c4b870c8e7945453958885e6fffb86058ef043f5da764387f6adad8b\": container with ID starting with 264407c3c4b870c8e7945453958885e6fffb86058ef043f5da764387f6adad8b not found: ID does not exist" containerID="264407c3c4b870c8e7945453958885e6fffb86058ef043f5da764387f6adad8b" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.454184 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"264407c3c4b870c8e7945453958885e6fffb86058ef043f5da764387f6adad8b"} err="failed to get container status \"264407c3c4b870c8e7945453958885e6fffb86058ef043f5da764387f6adad8b\": rpc error: code = NotFound desc = could not find container \"264407c3c4b870c8e7945453958885e6fffb86058ef043f5da764387f6adad8b\": container with ID starting with 264407c3c4b870c8e7945453958885e6fffb86058ef043f5da764387f6adad8b not found: ID does not exist" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.454212 4636 scope.go:117] "RemoveContainer" containerID="a898be59dbd89538af4ee14f6dccbde69e524f52fc8ca85d540c95602c1e9e7c" Oct 03 14:22:26 crc kubenswrapper[4636]: E1003 14:22:26.459271 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a898be59dbd89538af4ee14f6dccbde69e524f52fc8ca85d540c95602c1e9e7c\": container with ID starting with a898be59dbd89538af4ee14f6dccbde69e524f52fc8ca85d540c95602c1e9e7c not found: ID does not exist" containerID="a898be59dbd89538af4ee14f6dccbde69e524f52fc8ca85d540c95602c1e9e7c" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 
14:22:26.459316 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a898be59dbd89538af4ee14f6dccbde69e524f52fc8ca85d540c95602c1e9e7c"} err="failed to get container status \"a898be59dbd89538af4ee14f6dccbde69e524f52fc8ca85d540c95602c1e9e7c\": rpc error: code = NotFound desc = could not find container \"a898be59dbd89538af4ee14f6dccbde69e524f52fc8ca85d540c95602c1e9e7c\": container with ID starting with a898be59dbd89538af4ee14f6dccbde69e524f52fc8ca85d540c95602c1e9e7c not found: ID does not exist" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.459343 4636 scope.go:117] "RemoveContainer" containerID="264407c3c4b870c8e7945453958885e6fffb86058ef043f5da764387f6adad8b" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.459729 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"264407c3c4b870c8e7945453958885e6fffb86058ef043f5da764387f6adad8b"} err="failed to get container status \"264407c3c4b870c8e7945453958885e6fffb86058ef043f5da764387f6adad8b\": rpc error: code = NotFound desc = could not find container \"264407c3c4b870c8e7945453958885e6fffb86058ef043f5da764387f6adad8b\": container with ID starting with 264407c3c4b870c8e7945453958885e6fffb86058ef043f5da764387f6adad8b not found: ID does not exist" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.459764 4636 scope.go:117] "RemoveContainer" containerID="a898be59dbd89538af4ee14f6dccbde69e524f52fc8ca85d540c95602c1e9e7c" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.459976 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a898be59dbd89538af4ee14f6dccbde69e524f52fc8ca85d540c95602c1e9e7c"} err="failed to get container status \"a898be59dbd89538af4ee14f6dccbde69e524f52fc8ca85d540c95602c1e9e7c\": rpc error: code = NotFound desc = could not find container \"a898be59dbd89538af4ee14f6dccbde69e524f52fc8ca85d540c95602c1e9e7c\": container with ID starting with a898be59dbd89538af4ee14f6dccbde69e524f52fc8ca85d540c95602c1e9e7c not found: ID does not exist" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.507355 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jx8c\" (UniqueName: \"kubernetes.io/projected/0b554bdd-2609-4b94-b752-c1380a53cb25-kube-api-access-9jx8c\") pod \"nova-metadata-0\" (UID: \"0b554bdd-2609-4b94-b752-c1380a53cb25\") " pod="openstack/nova-metadata-0" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.507420 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b554bdd-2609-4b94-b752-c1380a53cb25-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0b554bdd-2609-4b94-b752-c1380a53cb25\") " pod="openstack/nova-metadata-0" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.507449 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b554bdd-2609-4b94-b752-c1380a53cb25-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0b554bdd-2609-4b94-b752-c1380a53cb25\") " pod="openstack/nova-metadata-0" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.507800 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b554bdd-2609-4b94-b752-c1380a53cb25-config-data\") pod 
\"nova-metadata-0\" (UID: \"0b554bdd-2609-4b94-b752-c1380a53cb25\") " pod="openstack/nova-metadata-0" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.507889 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b554bdd-2609-4b94-b752-c1380a53cb25-logs\") pod \"nova-metadata-0\" (UID: \"0b554bdd-2609-4b94-b752-c1380a53cb25\") " pod="openstack/nova-metadata-0" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.609897 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b554bdd-2609-4b94-b752-c1380a53cb25-config-data\") pod \"nova-metadata-0\" (UID: \"0b554bdd-2609-4b94-b752-c1380a53cb25\") " pod="openstack/nova-metadata-0" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.610004 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b554bdd-2609-4b94-b752-c1380a53cb25-logs\") pod \"nova-metadata-0\" (UID: \"0b554bdd-2609-4b94-b752-c1380a53cb25\") " pod="openstack/nova-metadata-0" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.610076 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jx8c\" (UniqueName: \"kubernetes.io/projected/0b554bdd-2609-4b94-b752-c1380a53cb25-kube-api-access-9jx8c\") pod \"nova-metadata-0\" (UID: \"0b554bdd-2609-4b94-b752-c1380a53cb25\") " pod="openstack/nova-metadata-0" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.610119 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b554bdd-2609-4b94-b752-c1380a53cb25-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0b554bdd-2609-4b94-b752-c1380a53cb25\") " pod="openstack/nova-metadata-0" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.610146 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b554bdd-2609-4b94-b752-c1380a53cb25-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0b554bdd-2609-4b94-b752-c1380a53cb25\") " pod="openstack/nova-metadata-0" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.610511 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b554bdd-2609-4b94-b752-c1380a53cb25-logs\") pod \"nova-metadata-0\" (UID: \"0b554bdd-2609-4b94-b752-c1380a53cb25\") " pod="openstack/nova-metadata-0" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.615716 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b554bdd-2609-4b94-b752-c1380a53cb25-config-data\") pod \"nova-metadata-0\" (UID: \"0b554bdd-2609-4b94-b752-c1380a53cb25\") " pod="openstack/nova-metadata-0" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.618634 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b554bdd-2609-4b94-b752-c1380a53cb25-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0b554bdd-2609-4b94-b752-c1380a53cb25\") " pod="openstack/nova-metadata-0" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.622914 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0b554bdd-2609-4b94-b752-c1380a53cb25-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0b554bdd-2609-4b94-b752-c1380a53cb25\") " pod="openstack/nova-metadata-0" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.635603 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jx8c\" (UniqueName: \"kubernetes.io/projected/0b554bdd-2609-4b94-b752-c1380a53cb25-kube-api-access-9jx8c\") pod \"nova-metadata-0\" (UID: \"0b554bdd-2609-4b94-b752-c1380a53cb25\") " pod="openstack/nova-metadata-0" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.766335 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.814780 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53cf2edd-da8b-4739-8e6f-24948c5d84c4" path="/var/lib/kubelet/pods/53cf2edd-da8b-4739-8e6f-24948c5d84c4/volumes" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.865539 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.888724 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.888981 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 14:22:26 crc kubenswrapper[4636]: I1003 14:22:26.930799 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 14:22:27 crc kubenswrapper[4636]: I1003 14:22:27.230236 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:22:27 crc kubenswrapper[4636]: W1003 14:22:27.237315 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b554bdd_2609_4b94_b752_c1380a53cb25.slice/crio-5c60c0efc5dd3813239b75ef2862feedc4bcc18e9e392cd0f074a23b05e2ca5f WatchSource:0}: Error finding container 5c60c0efc5dd3813239b75ef2862feedc4bcc18e9e392cd0f074a23b05e2ca5f: Status 404 returned error can't find the container with id 5c60c0efc5dd3813239b75ef2862feedc4bcc18e9e392cd0f074a23b05e2ca5f Oct 03 14:22:27 crc kubenswrapper[4636]: I1003 14:22:27.311135 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:22:27 crc kubenswrapper[4636]: I1003 14:22:27.376323 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-n62p4"] Oct 03 14:22:27 crc kubenswrapper[4636]: I1003 14:22:27.376584 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" podUID="04a66e39-5c5f-4473-a1af-376694a4f2cf" containerName="dnsmasq-dns" containerID="cri-o://98c2497c6ccb79309342f85a5bf129a0c835576a224d0085cb2a6afde4a680a4" gracePeriod=10 Oct 03 14:22:27 crc kubenswrapper[4636]: I1003 14:22:27.383286 4636 generic.go:334] "Generic (PLEG): container finished" podID="7f5bfa77-9c30-4f65-900b-62595074b467" containerID="cf31480d5894ef92aab08f7c038b7f96f19f5cdbe5835559a8bf45bff7ecda9d" exitCode=0 Oct 03 14:22:27 crc kubenswrapper[4636]: I1003 14:22:27.383365 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9l45j" 
event={"ID":"7f5bfa77-9c30-4f65-900b-62595074b467","Type":"ContainerDied","Data":"cf31480d5894ef92aab08f7c038b7f96f19f5cdbe5835559a8bf45bff7ecda9d"} Oct 03 14:22:27 crc kubenswrapper[4636]: I1003 14:22:27.398359 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0b554bdd-2609-4b94-b752-c1380a53cb25","Type":"ContainerStarted","Data":"5c60c0efc5dd3813239b75ef2862feedc4bcc18e9e392cd0f074a23b05e2ca5f"} Oct 03 14:22:27 crc kubenswrapper[4636]: I1003 14:22:27.420411 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 14:22:27 crc kubenswrapper[4636]: I1003 14:22:27.420759 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 14:22:27 crc kubenswrapper[4636]: I1003 14:22:27.468785 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.384040 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.410214 4636 generic.go:334] "Generic (PLEG): container finished" podID="04a66e39-5c5f-4473-a1af-376694a4f2cf" containerID="98c2497c6ccb79309342f85a5bf129a0c835576a224d0085cb2a6afde4a680a4" exitCode=0 Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.410295 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" event={"ID":"04a66e39-5c5f-4473-a1af-376694a4f2cf","Type":"ContainerDied","Data":"98c2497c6ccb79309342f85a5bf129a0c835576a224d0085cb2a6afde4a680a4"} Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.410328 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" event={"ID":"04a66e39-5c5f-4473-a1af-376694a4f2cf","Type":"ContainerDied","Data":"b48a169572087d6331f71f50f50a58b40f74ac613086f0855400aea5de0ba07f"} Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.410347 4636 scope.go:117] "RemoveContainer" containerID="98c2497c6ccb79309342f85a5bf129a0c835576a224d0085cb2a6afde4a680a4" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.410506 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-n62p4" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.441463 4636 scope.go:117] "RemoveContainer" containerID="42d59f20d9d3112092c15c388830898cb7bc8adfeeaa81215f6ab200b08a5d63" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.462596 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0b554bdd-2609-4b94-b752-c1380a53cb25","Type":"ContainerStarted","Data":"71541d6ded336be4c3c50f6e08f260f1c04d02407e3b074101cd711c68e5c6b3"} Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.462655 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0b554bdd-2609-4b94-b752-c1380a53cb25","Type":"ContainerStarted","Data":"4ddde715446b7928218dfce73d2e115cb3014dd980b0b1dc7ebcc6f93b058aa7"} Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.469412 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-dns-svc\") pod \"04a66e39-5c5f-4473-a1af-376694a4f2cf\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.469486 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-config\") pod \"04a66e39-5c5f-4473-a1af-376694a4f2cf\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.469568 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsvzc\" (UniqueName: \"kubernetes.io/projected/04a66e39-5c5f-4473-a1af-376694a4f2cf-kube-api-access-lsvzc\") pod \"04a66e39-5c5f-4473-a1af-376694a4f2cf\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.469686 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-ovsdbserver-sb\") pod \"04a66e39-5c5f-4473-a1af-376694a4f2cf\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.469729 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-ovsdbserver-nb\") pod \"04a66e39-5c5f-4473-a1af-376694a4f2cf\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.469828 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-dns-swift-storage-0\") pod \"04a66e39-5c5f-4473-a1af-376694a4f2cf\" (UID: \"04a66e39-5c5f-4473-a1af-376694a4f2cf\") " Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.500665 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04a66e39-5c5f-4473-a1af-376694a4f2cf-kube-api-access-lsvzc" (OuterVolumeSpecName: "kube-api-access-lsvzc") pod "04a66e39-5c5f-4473-a1af-376694a4f2cf" (UID: "04a66e39-5c5f-4473-a1af-376694a4f2cf"). InnerVolumeSpecName "kube-api-access-lsvzc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.506052 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b5ccdfa8-6bab-4376-8abf-8def038929c5" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.506329 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b5ccdfa8-6bab-4376-8abf-8def038929c5" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.514783 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.514767836 podStartE2EDuration="2.514767836s" podCreationTimestamp="2025-10-03 14:22:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:22:28.448450702 +0000 UTC m=+1298.307176959" watchObservedRunningTime="2025-10-03 14:22:28.514767836 +0000 UTC m=+1298.373494083" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.578165 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsvzc\" (UniqueName: \"kubernetes.io/projected/04a66e39-5c5f-4473-a1af-376694a4f2cf-kube-api-access-lsvzc\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.582075 4636 scope.go:117] "RemoveContainer" containerID="98c2497c6ccb79309342f85a5bf129a0c835576a224d0085cb2a6afde4a680a4" Oct 03 14:22:28 crc kubenswrapper[4636]: E1003 14:22:28.582552 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98c2497c6ccb79309342f85a5bf129a0c835576a224d0085cb2a6afde4a680a4\": container with ID starting with 98c2497c6ccb79309342f85a5bf129a0c835576a224d0085cb2a6afde4a680a4 not found: ID does not exist" containerID="98c2497c6ccb79309342f85a5bf129a0c835576a224d0085cb2a6afde4a680a4" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.582586 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98c2497c6ccb79309342f85a5bf129a0c835576a224d0085cb2a6afde4a680a4"} err="failed to get container status \"98c2497c6ccb79309342f85a5bf129a0c835576a224d0085cb2a6afde4a680a4\": rpc error: code = NotFound desc = could not find container \"98c2497c6ccb79309342f85a5bf129a0c835576a224d0085cb2a6afde4a680a4\": container with ID starting with 98c2497c6ccb79309342f85a5bf129a0c835576a224d0085cb2a6afde4a680a4 not found: ID does not exist" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.582607 4636 scope.go:117] "RemoveContainer" containerID="42d59f20d9d3112092c15c388830898cb7bc8adfeeaa81215f6ab200b08a5d63" Oct 03 14:22:28 crc kubenswrapper[4636]: E1003 14:22:28.582895 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42d59f20d9d3112092c15c388830898cb7bc8adfeeaa81215f6ab200b08a5d63\": container with ID starting with 42d59f20d9d3112092c15c388830898cb7bc8adfeeaa81215f6ab200b08a5d63 not found: ID does not exist" containerID="42d59f20d9d3112092c15c388830898cb7bc8adfeeaa81215f6ab200b08a5d63" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 
14:22:28.582916 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42d59f20d9d3112092c15c388830898cb7bc8adfeeaa81215f6ab200b08a5d63"} err="failed to get container status \"42d59f20d9d3112092c15c388830898cb7bc8adfeeaa81215f6ab200b08a5d63\": rpc error: code = NotFound desc = could not find container \"42d59f20d9d3112092c15c388830898cb7bc8adfeeaa81215f6ab200b08a5d63\": container with ID starting with 42d59f20d9d3112092c15c388830898cb7bc8adfeeaa81215f6ab200b08a5d63 not found: ID does not exist" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.615944 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "04a66e39-5c5f-4473-a1af-376694a4f2cf" (UID: "04a66e39-5c5f-4473-a1af-376694a4f2cf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.627631 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "04a66e39-5c5f-4473-a1af-376694a4f2cf" (UID: "04a66e39-5c5f-4473-a1af-376694a4f2cf"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.644808 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "04a66e39-5c5f-4473-a1af-376694a4f2cf" (UID: "04a66e39-5c5f-4473-a1af-376694a4f2cf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.649995 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04a66e39-5c5f-4473-a1af-376694a4f2cf" (UID: "04a66e39-5c5f-4473-a1af-376694a4f2cf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.665133 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-config" (OuterVolumeSpecName: "config") pod "04a66e39-5c5f-4473-a1af-376694a4f2cf" (UID: "04a66e39-5c5f-4473-a1af-376694a4f2cf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.680741 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.680828 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.680847 4636 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.680860 4636 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.680872 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04a66e39-5c5f-4473-a1af-376694a4f2cf-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.815838 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-n62p4"] Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.826629 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-n62p4"] Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.850239 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9l45j" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.901877 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkrxh\" (UniqueName: \"kubernetes.io/projected/7f5bfa77-9c30-4f65-900b-62595074b467-kube-api-access-nkrxh\") pod \"7f5bfa77-9c30-4f65-900b-62595074b467\" (UID: \"7f5bfa77-9c30-4f65-900b-62595074b467\") " Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.901945 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f5bfa77-9c30-4f65-900b-62595074b467-config-data\") pod \"7f5bfa77-9c30-4f65-900b-62595074b467\" (UID: \"7f5bfa77-9c30-4f65-900b-62595074b467\") " Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.902088 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f5bfa77-9c30-4f65-900b-62595074b467-scripts\") pod \"7f5bfa77-9c30-4f65-900b-62595074b467\" (UID: \"7f5bfa77-9c30-4f65-900b-62595074b467\") " Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.902166 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f5bfa77-9c30-4f65-900b-62595074b467-combined-ca-bundle\") pod \"7f5bfa77-9c30-4f65-900b-62595074b467\" (UID: \"7f5bfa77-9c30-4f65-900b-62595074b467\") " Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.910862 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f5bfa77-9c30-4f65-900b-62595074b467-scripts" (OuterVolumeSpecName: "scripts") pod "7f5bfa77-9c30-4f65-900b-62595074b467" (UID: "7f5bfa77-9c30-4f65-900b-62595074b467"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.911773 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f5bfa77-9c30-4f65-900b-62595074b467-kube-api-access-nkrxh" (OuterVolumeSpecName: "kube-api-access-nkrxh") pod "7f5bfa77-9c30-4f65-900b-62595074b467" (UID: "7f5bfa77-9c30-4f65-900b-62595074b467"). InnerVolumeSpecName "kube-api-access-nkrxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.928028 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f5bfa77-9c30-4f65-900b-62595074b467-config-data" (OuterVolumeSpecName: "config-data") pod "7f5bfa77-9c30-4f65-900b-62595074b467" (UID: "7f5bfa77-9c30-4f65-900b-62595074b467"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:22:28 crc kubenswrapper[4636]: I1003 14:22:28.931734 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f5bfa77-9c30-4f65-900b-62595074b467-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f5bfa77-9c30-4f65-900b-62595074b467" (UID: "7f5bfa77-9c30-4f65-900b-62595074b467"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:22:29 crc kubenswrapper[4636]: I1003 14:22:29.004427 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkrxh\" (UniqueName: \"kubernetes.io/projected/7f5bfa77-9c30-4f65-900b-62595074b467-kube-api-access-nkrxh\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:29 crc kubenswrapper[4636]: I1003 14:22:29.004465 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f5bfa77-9c30-4f65-900b-62595074b467-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:29 crc kubenswrapper[4636]: I1003 14:22:29.004474 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f5bfa77-9c30-4f65-900b-62595074b467-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:29 crc kubenswrapper[4636]: I1003 14:22:29.004482 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f5bfa77-9c30-4f65-900b-62595074b467-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:29 crc kubenswrapper[4636]: I1003 14:22:29.474112 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9l45j" event={"ID":"7f5bfa77-9c30-4f65-900b-62595074b467","Type":"ContainerDied","Data":"1f4cd6149225356f4f00a12e247d72f920399745272be83c4721cb2264a7e729"} Oct 03 14:22:29 crc kubenswrapper[4636]: I1003 14:22:29.474176 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f4cd6149225356f4f00a12e247d72f920399745272be83c4721cb2264a7e729" Oct 03 14:22:29 crc kubenswrapper[4636]: I1003 14:22:29.474132 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9l45j" Oct 03 14:22:29 crc kubenswrapper[4636]: I1003 14:22:29.563807 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:22:29 crc kubenswrapper[4636]: I1003 14:22:29.564279 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b5ccdfa8-6bab-4376-8abf-8def038929c5" containerName="nova-api-log" containerID="cri-o://96645a637af80246c386a3faf6eb0f0418bb82d9fb75e27420668887c7fbba46" gracePeriod=30 Oct 03 14:22:29 crc kubenswrapper[4636]: I1003 14:22:29.564713 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b5ccdfa8-6bab-4376-8abf-8def038929c5" containerName="nova-api-api" containerID="cri-o://8c4ba55c7ed87d560530a235761e873e6f6ce9c9da37eba639124250ee25a1f7" gracePeriod=30 Oct 03 14:22:29 crc kubenswrapper[4636]: I1003 14:22:29.586806 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 14:22:29 crc kubenswrapper[4636]: I1003 14:22:29.605272 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:22:30 crc kubenswrapper[4636]: I1003 14:22:30.488384 4636 generic.go:334] "Generic (PLEG): container finished" podID="b5ccdfa8-6bab-4376-8abf-8def038929c5" containerID="96645a637af80246c386a3faf6eb0f0418bb82d9fb75e27420668887c7fbba46" exitCode=143 Oct 03 14:22:30 crc kubenswrapper[4636]: I1003 14:22:30.488471 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5ccdfa8-6bab-4376-8abf-8def038929c5","Type":"ContainerDied","Data":"96645a637af80246c386a3faf6eb0f0418bb82d9fb75e27420668887c7fbba46"} Oct 03 14:22:30 crc 
kubenswrapper[4636]: I1003 14:22:30.488584 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0b554bdd-2609-4b94-b752-c1380a53cb25" containerName="nova-metadata-log" containerID="cri-o://4ddde715446b7928218dfce73d2e115cb3014dd980b0b1dc7ebcc6f93b058aa7" gracePeriod=30 Oct 03 14:22:30 crc kubenswrapper[4636]: I1003 14:22:30.489074 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c" containerName="nova-scheduler-scheduler" containerID="cri-o://56303358c54c5d57b132441046c61fcfc6116d91143dd3a53c24c8a7f24ef2b1" gracePeriod=30 Oct 03 14:22:30 crc kubenswrapper[4636]: I1003 14:22:30.489291 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0b554bdd-2609-4b94-b752-c1380a53cb25" containerName="nova-metadata-metadata" containerID="cri-o://71541d6ded336be4c3c50f6e08f260f1c04d02407e3b074101cd711c68e5c6b3" gracePeriod=30 Oct 03 14:22:30 crc kubenswrapper[4636]: I1003 14:22:30.805394 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04a66e39-5c5f-4473-a1af-376694a4f2cf" path="/var/lib/kubelet/pods/04a66e39-5c5f-4473-a1af-376694a4f2cf/volumes" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.131431 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.149625 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jx8c\" (UniqueName: \"kubernetes.io/projected/0b554bdd-2609-4b94-b752-c1380a53cb25-kube-api-access-9jx8c\") pod \"0b554bdd-2609-4b94-b752-c1380a53cb25\" (UID: \"0b554bdd-2609-4b94-b752-c1380a53cb25\") " Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.149743 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b554bdd-2609-4b94-b752-c1380a53cb25-logs\") pod \"0b554bdd-2609-4b94-b752-c1380a53cb25\" (UID: \"0b554bdd-2609-4b94-b752-c1380a53cb25\") " Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.149792 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b554bdd-2609-4b94-b752-c1380a53cb25-nova-metadata-tls-certs\") pod \"0b554bdd-2609-4b94-b752-c1380a53cb25\" (UID: \"0b554bdd-2609-4b94-b752-c1380a53cb25\") " Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.149867 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b554bdd-2609-4b94-b752-c1380a53cb25-combined-ca-bundle\") pod \"0b554bdd-2609-4b94-b752-c1380a53cb25\" (UID: \"0b554bdd-2609-4b94-b752-c1380a53cb25\") " Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.150000 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b554bdd-2609-4b94-b752-c1380a53cb25-config-data\") pod \"0b554bdd-2609-4b94-b752-c1380a53cb25\" (UID: \"0b554bdd-2609-4b94-b752-c1380a53cb25\") " Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.155215 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b554bdd-2609-4b94-b752-c1380a53cb25-logs" (OuterVolumeSpecName: "logs") pod "0b554bdd-2609-4b94-b752-c1380a53cb25" (UID: 
"0b554bdd-2609-4b94-b752-c1380a53cb25"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.191501 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b554bdd-2609-4b94-b752-c1380a53cb25-kube-api-access-9jx8c" (OuterVolumeSpecName: "kube-api-access-9jx8c") pod "0b554bdd-2609-4b94-b752-c1380a53cb25" (UID: "0b554bdd-2609-4b94-b752-c1380a53cb25"). InnerVolumeSpecName "kube-api-access-9jx8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.214157 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b554bdd-2609-4b94-b752-c1380a53cb25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b554bdd-2609-4b94-b752-c1380a53cb25" (UID: "0b554bdd-2609-4b94-b752-c1380a53cb25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.226408 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b554bdd-2609-4b94-b752-c1380a53cb25-config-data" (OuterVolumeSpecName: "config-data") pod "0b554bdd-2609-4b94-b752-c1380a53cb25" (UID: "0b554bdd-2609-4b94-b752-c1380a53cb25"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.255499 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b554bdd-2609-4b94-b752-c1380a53cb25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.255540 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b554bdd-2609-4b94-b752-c1380a53cb25-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.255552 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jx8c\" (UniqueName: \"kubernetes.io/projected/0b554bdd-2609-4b94-b752-c1380a53cb25-kube-api-access-9jx8c\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.255562 4636 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b554bdd-2609-4b94-b752-c1380a53cb25-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.293947 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b554bdd-2609-4b94-b752-c1380a53cb25-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0b554bdd-2609-4b94-b752-c1380a53cb25" (UID: "0b554bdd-2609-4b94-b752-c1380a53cb25"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.358604 4636 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b554bdd-2609-4b94-b752-c1380a53cb25-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.514347 4636 generic.go:334] "Generic (PLEG): container finished" podID="0b554bdd-2609-4b94-b752-c1380a53cb25" containerID="71541d6ded336be4c3c50f6e08f260f1c04d02407e3b074101cd711c68e5c6b3" exitCode=0 Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.514398 4636 generic.go:334] "Generic (PLEG): container finished" podID="0b554bdd-2609-4b94-b752-c1380a53cb25" containerID="4ddde715446b7928218dfce73d2e115cb3014dd980b0b1dc7ebcc6f93b058aa7" exitCode=143 Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.514420 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0b554bdd-2609-4b94-b752-c1380a53cb25","Type":"ContainerDied","Data":"71541d6ded336be4c3c50f6e08f260f1c04d02407e3b074101cd711c68e5c6b3"} Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.514561 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0b554bdd-2609-4b94-b752-c1380a53cb25","Type":"ContainerDied","Data":"4ddde715446b7928218dfce73d2e115cb3014dd980b0b1dc7ebcc6f93b058aa7"} Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.514574 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0b554bdd-2609-4b94-b752-c1380a53cb25","Type":"ContainerDied","Data":"5c60c0efc5dd3813239b75ef2862feedc4bcc18e9e392cd0f074a23b05e2ca5f"} Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.514589 4636 scope.go:117] "RemoveContainer" containerID="71541d6ded336be4c3c50f6e08f260f1c04d02407e3b074101cd711c68e5c6b3" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.514795 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.563209 4636 scope.go:117] "RemoveContainer" containerID="4ddde715446b7928218dfce73d2e115cb3014dd980b0b1dc7ebcc6f93b058aa7" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.568419 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.591805 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.616079 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:22:31 crc kubenswrapper[4636]: E1003 14:22:31.616654 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f5bfa77-9c30-4f65-900b-62595074b467" containerName="nova-manage" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.616675 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f5bfa77-9c30-4f65-900b-62595074b467" containerName="nova-manage" Oct 03 14:22:31 crc kubenswrapper[4636]: E1003 14:22:31.616701 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a66e39-5c5f-4473-a1af-376694a4f2cf" containerName="dnsmasq-dns" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.616712 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a66e39-5c5f-4473-a1af-376694a4f2cf" containerName="dnsmasq-dns" Oct 03 14:22:31 crc kubenswrapper[4636]: E1003 14:22:31.616744 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b554bdd-2609-4b94-b752-c1380a53cb25" containerName="nova-metadata-metadata" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.616753 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b554bdd-2609-4b94-b752-c1380a53cb25" containerName="nova-metadata-metadata" Oct 03 14:22:31 crc kubenswrapper[4636]: E1003 14:22:31.616791 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b554bdd-2609-4b94-b752-c1380a53cb25" containerName="nova-metadata-log" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.616802 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b554bdd-2609-4b94-b752-c1380a53cb25" containerName="nova-metadata-log" Oct 03 14:22:31 crc kubenswrapper[4636]: E1003 14:22:31.616819 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a66e39-5c5f-4473-a1af-376694a4f2cf" containerName="init" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.616826 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a66e39-5c5f-4473-a1af-376694a4f2cf" containerName="init" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.617059 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b554bdd-2609-4b94-b752-c1380a53cb25" containerName="nova-metadata-log" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.617079 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b554bdd-2609-4b94-b752-c1380a53cb25" containerName="nova-metadata-metadata" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.617092 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f5bfa77-9c30-4f65-900b-62595074b467" containerName="nova-manage" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.617205 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="04a66e39-5c5f-4473-a1af-376694a4f2cf" containerName="dnsmasq-dns" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.618556 4636 util.go:30] "No 
Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.622690 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.623326 4636 scope.go:117] "RemoveContainer" containerID="71541d6ded336be4c3c50f6e08f260f1c04d02407e3b074101cd711c68e5c6b3"
Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.623827 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 03 14:22:31 crc kubenswrapper[4636]: E1003 14:22:31.624407 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71541d6ded336be4c3c50f6e08f260f1c04d02407e3b074101cd711c68e5c6b3\": container with ID starting with 71541d6ded336be4c3c50f6e08f260f1c04d02407e3b074101cd711c68e5c6b3 not found: ID does not exist" containerID="71541d6ded336be4c3c50f6e08f260f1c04d02407e3b074101cd711c68e5c6b3"
Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.624444 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71541d6ded336be4c3c50f6e08f260f1c04d02407e3b074101cd711c68e5c6b3"} err="failed to get container status \"71541d6ded336be4c3c50f6e08f260f1c04d02407e3b074101cd711c68e5c6b3\": rpc error: code = NotFound desc = could not find container \"71541d6ded336be4c3c50f6e08f260f1c04d02407e3b074101cd711c68e5c6b3\": container with ID starting with 71541d6ded336be4c3c50f6e08f260f1c04d02407e3b074101cd711c68e5c6b3 not found: ID does not exist"
Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.624469 4636 scope.go:117] "RemoveContainer" containerID="4ddde715446b7928218dfce73d2e115cb3014dd980b0b1dc7ebcc6f93b058aa7"
Oct 03 14:22:31 crc kubenswrapper[4636]: E1003 14:22:31.624910 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ddde715446b7928218dfce73d2e115cb3014dd980b0b1dc7ebcc6f93b058aa7\": container with ID starting with 4ddde715446b7928218dfce73d2e115cb3014dd980b0b1dc7ebcc6f93b058aa7 not found: ID does not exist" containerID="4ddde715446b7928218dfce73d2e115cb3014dd980b0b1dc7ebcc6f93b058aa7"
Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.624938 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ddde715446b7928218dfce73d2e115cb3014dd980b0b1dc7ebcc6f93b058aa7"} err="failed to get container status \"4ddde715446b7928218dfce73d2e115cb3014dd980b0b1dc7ebcc6f93b058aa7\": rpc error: code = NotFound desc = could not find container \"4ddde715446b7928218dfce73d2e115cb3014dd980b0b1dc7ebcc6f93b058aa7\": container with ID starting with 4ddde715446b7928218dfce73d2e115cb3014dd980b0b1dc7ebcc6f93b058aa7 not found: ID does not exist"
Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.624959 4636 scope.go:117] "RemoveContainer" containerID="71541d6ded336be4c3c50f6e08f260f1c04d02407e3b074101cd711c68e5c6b3"
Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.625459 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71541d6ded336be4c3c50f6e08f260f1c04d02407e3b074101cd711c68e5c6b3"} err="failed to get container status \"71541d6ded336be4c3c50f6e08f260f1c04d02407e3b074101cd711c68e5c6b3\": rpc error: code = NotFound desc = could not find container \"71541d6ded336be4c3c50f6e08f260f1c04d02407e3b074101cd711c68e5c6b3\": container with ID starting with 71541d6ded336be4c3c50f6e08f260f1c04d02407e3b074101cd711c68e5c6b3 not found: ID does not exist"
Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.625510 4636 scope.go:117] "RemoveContainer" containerID="4ddde715446b7928218dfce73d2e115cb3014dd980b0b1dc7ebcc6f93b058aa7"
Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.627567 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ddde715446b7928218dfce73d2e115cb3014dd980b0b1dc7ebcc6f93b058aa7"} err="failed to get container status \"4ddde715446b7928218dfce73d2e115cb3014dd980b0b1dc7ebcc6f93b058aa7\": rpc error: code = NotFound desc = could not find container \"4ddde715446b7928218dfce73d2e115cb3014dd980b0b1dc7ebcc6f93b058aa7\": container with ID starting with 4ddde715446b7928218dfce73d2e115cb3014dd980b0b1dc7ebcc6f93b058aa7 not found: ID does not exist"
Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.629805 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.663816 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5d6a8a9-579d-4033-9123-53639ded7cdf-config-data\") pod \"nova-metadata-0\" (UID: \"e5d6a8a9-579d-4033-9123-53639ded7cdf\") " pod="openstack/nova-metadata-0"
Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.664135 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68tmj\" (UniqueName: \"kubernetes.io/projected/e5d6a8a9-579d-4033-9123-53639ded7cdf-kube-api-access-68tmj\") pod \"nova-metadata-0\" (UID: \"e5d6a8a9-579d-4033-9123-53639ded7cdf\") " pod="openstack/nova-metadata-0"
Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.664284 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5d6a8a9-579d-4033-9123-53639ded7cdf-logs\") pod \"nova-metadata-0\" (UID: \"e5d6a8a9-579d-4033-9123-53639ded7cdf\") " pod="openstack/nova-metadata-0"
Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.664517 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d6a8a9-579d-4033-9123-53639ded7cdf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e5d6a8a9-579d-4033-9123-53639ded7cdf\") " pod="openstack/nova-metadata-0"
Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.664608 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d6a8a9-579d-4033-9123-53639ded7cdf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e5d6a8a9-579d-4033-9123-53639ded7cdf\") " pod="openstack/nova-metadata-0"
Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.767405 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5d6a8a9-579d-4033-9123-53639ded7cdf-config-data\") pod \"nova-metadata-0\" (UID: \"e5d6a8a9-579d-4033-9123-53639ded7cdf\") " pod="openstack/nova-metadata-0"
Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.768125 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68tmj\" (UniqueName: \"kubernetes.io/projected/e5d6a8a9-579d-4033-9123-53639ded7cdf-kube-api-access-68tmj\") pod \"nova-metadata-0\" (UID: \"e5d6a8a9-579d-4033-9123-53639ded7cdf\") " pod="openstack/nova-metadata-0"
\"kubernetes.io/projected/e5d6a8a9-579d-4033-9123-53639ded7cdf-kube-api-access-68tmj\") pod \"nova-metadata-0\" (UID: \"e5d6a8a9-579d-4033-9123-53639ded7cdf\") " pod="openstack/nova-metadata-0" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.768380 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5d6a8a9-579d-4033-9123-53639ded7cdf-logs\") pod \"nova-metadata-0\" (UID: \"e5d6a8a9-579d-4033-9123-53639ded7cdf\") " pod="openstack/nova-metadata-0" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.768624 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d6a8a9-579d-4033-9123-53639ded7cdf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e5d6a8a9-579d-4033-9123-53639ded7cdf\") " pod="openstack/nova-metadata-0" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.768772 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d6a8a9-579d-4033-9123-53639ded7cdf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e5d6a8a9-579d-4033-9123-53639ded7cdf\") " pod="openstack/nova-metadata-0" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.769258 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5d6a8a9-579d-4033-9123-53639ded7cdf-logs\") pod \"nova-metadata-0\" (UID: \"e5d6a8a9-579d-4033-9123-53639ded7cdf\") " pod="openstack/nova-metadata-0" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.772689 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5d6a8a9-579d-4033-9123-53639ded7cdf-config-data\") pod \"nova-metadata-0\" (UID: \"e5d6a8a9-579d-4033-9123-53639ded7cdf\") " pod="openstack/nova-metadata-0" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.772991 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d6a8a9-579d-4033-9123-53639ded7cdf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e5d6a8a9-579d-4033-9123-53639ded7cdf\") " pod="openstack/nova-metadata-0" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.776684 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d6a8a9-579d-4033-9123-53639ded7cdf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e5d6a8a9-579d-4033-9123-53639ded7cdf\") " pod="openstack/nova-metadata-0" Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.789532 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68tmj\" (UniqueName: \"kubernetes.io/projected/e5d6a8a9-579d-4033-9123-53639ded7cdf-kube-api-access-68tmj\") pod \"nova-metadata-0\" (UID: \"e5d6a8a9-579d-4033-9123-53639ded7cdf\") " pod="openstack/nova-metadata-0" Oct 03 14:22:31 crc kubenswrapper[4636]: E1003 14:22:31.892374 4636 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="56303358c54c5d57b132441046c61fcfc6116d91143dd3a53c24c8a7f24ef2b1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 14:22:31 crc kubenswrapper[4636]: E1003 14:22:31.893999 4636 log.go:32] "ExecSync cmd from 
Oct 03 14:22:31 crc kubenswrapper[4636]: E1003 14:22:31.895434 4636 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="56303358c54c5d57b132441046c61fcfc6116d91143dd3a53c24c8a7f24ef2b1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 03 14:22:31 crc kubenswrapper[4636]: E1003 14:22:31.895473 4636 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c" containerName="nova-scheduler-scheduler"
Oct 03 14:22:31 crc kubenswrapper[4636]: I1003 14:22:31.973420 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 14:22:32 crc kubenswrapper[4636]: I1003 14:22:32.433609 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 14:22:32 crc kubenswrapper[4636]: I1003 14:22:32.524652 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5d6a8a9-579d-4033-9123-53639ded7cdf","Type":"ContainerStarted","Data":"e7c15eb643fa4414edc1457ba23ee323a783aa0e9e66276b741fa7bbfff2574d"}
Oct 03 14:22:32 crc kubenswrapper[4636]: I1003 14:22:32.526952 4636 generic.go:334] "Generic (PLEG): container finished" podID="60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5" containerID="da82b036efdbc7e73e12a5ebefa44760b78b445f536f231f90c1a47b60593634" exitCode=0
Oct 03 14:22:32 crc kubenswrapper[4636]: I1003 14:22:32.527004 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9b2q2" event={"ID":"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5","Type":"ContainerDied","Data":"da82b036efdbc7e73e12a5ebefa44760b78b445f536f231f90c1a47b60593634"}
Oct 03 14:22:32 crc kubenswrapper[4636]: I1003 14:22:32.535142 4636 generic.go:334] "Generic (PLEG): container finished" podID="e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c" containerID="56303358c54c5d57b132441046c61fcfc6116d91143dd3a53c24c8a7f24ef2b1" exitCode=0
Oct 03 14:22:32 crc kubenswrapper[4636]: I1003 14:22:32.535190 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c","Type":"ContainerDied","Data":"56303358c54c5d57b132441046c61fcfc6116d91143dd3a53c24c8a7f24ef2b1"}
Oct 03 14:22:32 crc kubenswrapper[4636]: I1003 14:22:32.807029 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b554bdd-2609-4b94-b752-c1380a53cb25" path="/var/lib/kubelet/pods/0b554bdd-2609-4b94-b752-c1380a53cb25/volumes"
Oct 03 14:22:32 crc kubenswrapper[4636]: I1003 14:22:32.866433 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 03 14:22:32 crc kubenswrapper[4636]: I1003 14:22:32.907660 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c-combined-ca-bundle\") pod \"e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c\" (UID: \"e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c\") "
Oct 03 14:22:32 crc kubenswrapper[4636]: I1003 14:22:32.907769 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c-config-data\") pod \"e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c\" (UID: \"e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c\") "
Oct 03 14:22:32 crc kubenswrapper[4636]: I1003 14:22:32.907894 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ld68\" (UniqueName: \"kubernetes.io/projected/e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c-kube-api-access-7ld68\") pod \"e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c\" (UID: \"e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c\") "
Oct 03 14:22:32 crc kubenswrapper[4636]: I1003 14:22:32.919846 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c-kube-api-access-7ld68" (OuterVolumeSpecName: "kube-api-access-7ld68") pod "e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c" (UID: "e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c"). InnerVolumeSpecName "kube-api-access-7ld68". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:22:32 crc kubenswrapper[4636]: I1003 14:22:32.961245 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c-config-data" (OuterVolumeSpecName: "config-data") pod "e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c" (UID: "e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:22:32 crc kubenswrapper[4636]: I1003 14:22:32.972821 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c" (UID: "e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.010081 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ld68\" (UniqueName: \"kubernetes.io/projected/e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c-kube-api-access-7ld68\") on node \"crc\" DevicePath \"\""
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.010121 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.010130 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.545674 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.545685 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c","Type":"ContainerDied","Data":"66fe638930f0b7955a7dbdcfe4b10374da2108fe7fd8f7b94376656a9efbaabb"}
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.545771 4636 scope.go:117] "RemoveContainer" containerID="56303358c54c5d57b132441046c61fcfc6116d91143dd3a53c24c8a7f24ef2b1"
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.549152 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5d6a8a9-579d-4033-9123-53639ded7cdf","Type":"ContainerStarted","Data":"b8a0fb1d7b200803187cb6a9bef0cf24d19750d6c72cbfd2825c0571e08cbd85"}
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.549633 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5d6a8a9-579d-4033-9123-53639ded7cdf","Type":"ContainerStarted","Data":"b2fe2bb07409d315e531d383bbce7c09ac70821f4d5009db0da4c4bed2468774"}
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.579288 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.579262644 podStartE2EDuration="2.579262644s" podCreationTimestamp="2025-10-03 14:22:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:22:33.568361862 +0000 UTC m=+1303.427088119" watchObservedRunningTime="2025-10-03 14:22:33.579262644 +0000 UTC m=+1303.437988891"
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.595550 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.608853 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.618054 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 14:22:33 crc kubenswrapper[4636]: E1003 14:22:33.618531 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c" containerName="nova-scheduler-scheduler"
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.618582 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c" containerName="nova-scheduler-scheduler"
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.618826 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c" containerName="nova-scheduler-scheduler"
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.620032 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.624369 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.628533 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.725044 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a15f16-0baf-43de-a850-b7a97bac0c65-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"94a15f16-0baf-43de-a850-b7a97bac0c65\") " pod="openstack/nova-scheduler-0"
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.725260 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55mwv\" (UniqueName: \"kubernetes.io/projected/94a15f16-0baf-43de-a850-b7a97bac0c65-kube-api-access-55mwv\") pod \"nova-scheduler-0\" (UID: \"94a15f16-0baf-43de-a850-b7a97bac0c65\") " pod="openstack/nova-scheduler-0"
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.725359 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94a15f16-0baf-43de-a850-b7a97bac0c65-config-data\") pod \"nova-scheduler-0\" (UID: \"94a15f16-0baf-43de-a850-b7a97bac0c65\") " pod="openstack/nova-scheduler-0"
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.826362 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a15f16-0baf-43de-a850-b7a97bac0c65-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"94a15f16-0baf-43de-a850-b7a97bac0c65\") " pod="openstack/nova-scheduler-0"
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.826704 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55mwv\" (UniqueName: \"kubernetes.io/projected/94a15f16-0baf-43de-a850-b7a97bac0c65-kube-api-access-55mwv\") pod \"nova-scheduler-0\" (UID: \"94a15f16-0baf-43de-a850-b7a97bac0c65\") " pod="openstack/nova-scheduler-0"
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.826782 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94a15f16-0baf-43de-a850-b7a97bac0c65-config-data\") pod \"nova-scheduler-0\" (UID: \"94a15f16-0baf-43de-a850-b7a97bac0c65\") " pod="openstack/nova-scheduler-0"
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.830626 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a15f16-0baf-43de-a850-b7a97bac0c65-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"94a15f16-0baf-43de-a850-b7a97bac0c65\") " pod="openstack/nova-scheduler-0"
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.839816 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94a15f16-0baf-43de-a850-b7a97bac0c65-config-data\") pod \"nova-scheduler-0\" (UID: \"94a15f16-0baf-43de-a850-b7a97bac0c65\") " pod="openstack/nova-scheduler-0"
Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.844754 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55mwv\" (UniqueName: \"kubernetes.io/projected/94a15f16-0baf-43de-a850-b7a97bac0c65-kube-api-access-55mwv\") pod \"nova-scheduler-0\" (UID: \"94a15f16-0baf-43de-a850-b7a97bac0c65\") " pod="openstack/nova-scheduler-0"
\"kubernetes.io/projected/94a15f16-0baf-43de-a850-b7a97bac0c65-kube-api-access-55mwv\") pod \"nova-scheduler-0\" (UID: \"94a15f16-0baf-43de-a850-b7a97bac0c65\") " pod="openstack/nova-scheduler-0" Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.922267 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9b2q2" Oct 03 14:22:33 crc kubenswrapper[4636]: I1003 14:22:33.973581 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.029844 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-scripts\") pod \"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5\" (UID: \"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5\") " Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.029908 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gtj2\" (UniqueName: \"kubernetes.io/projected/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-kube-api-access-9gtj2\") pod \"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5\" (UID: \"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5\") " Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.030062 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-combined-ca-bundle\") pod \"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5\" (UID: \"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5\") " Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.030082 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-config-data\") pod \"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5\" (UID: \"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5\") " Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.036712 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-kube-api-access-9gtj2" (OuterVolumeSpecName: "kube-api-access-9gtj2") pod "60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5" (UID: "60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5"). InnerVolumeSpecName "kube-api-access-9gtj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.051257 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-scripts" (OuterVolumeSpecName: "scripts") pod "60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5" (UID: "60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.070555 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-config-data" (OuterVolumeSpecName: "config-data") pod "60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5" (UID: "60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.086163 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5" (UID: "60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.132142 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.132177 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gtj2\" (UniqueName: \"kubernetes.io/projected/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-kube-api-access-9gtj2\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.132187 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.132222 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.561482 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9b2q2" Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.561486 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9b2q2" event={"ID":"60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5","Type":"ContainerDied","Data":"015679a27ba3199cca1f6f7b0c6d7363dec01238fc4847f08a1a2b033c044c80"} Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.561570 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="015679a27ba3199cca1f6f7b0c6d7363dec01238fc4847f08a1a2b033c044c80" Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.669402 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 14:22:34 crc kubenswrapper[4636]: E1003 14:22:34.669960 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5" containerName="nova-cell1-conductor-db-sync" Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.669977 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5" containerName="nova-cell1-conductor-db-sync" Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.670283 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5" containerName="nova-cell1-conductor-db-sync" Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.671058 4636 util.go:30] "No sandbox for pod can be found. 
Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.677413 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.678647 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.758779 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d\") " pod="openstack/nova-cell1-conductor-0"
Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.758884 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d\") " pod="openstack/nova-cell1-conductor-0"
Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.758925 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8stqs\" (UniqueName: \"kubernetes.io/projected/d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d-kube-api-access-8stqs\") pod \"nova-cell1-conductor-0\" (UID: \"d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d\") " pod="openstack/nova-cell1-conductor-0"
Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.804494 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c" path="/var/lib/kubelet/pods/e025b6c1-c1da-4cc7-b7e1-9c271aa45f4c/volumes"
Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.860623 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d\") " pod="openstack/nova-cell1-conductor-0"
Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.860877 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d\") " pod="openstack/nova-cell1-conductor-0"
Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.860921 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8stqs\" (UniqueName: \"kubernetes.io/projected/d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d-kube-api-access-8stqs\") pod \"nova-cell1-conductor-0\" (UID: \"d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d\") " pod="openstack/nova-cell1-conductor-0"
Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.866126 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d\") " pod="openstack/nova-cell1-conductor-0"
Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.866127 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d\") " pod="openstack/nova-cell1-conductor-0"
\"kubernetes.io/secret/d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d\") " pod="openstack/nova-cell1-conductor-0" Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.883994 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8stqs\" (UniqueName: \"kubernetes.io/projected/d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d-kube-api-access-8stqs\") pod \"nova-cell1-conductor-0\" (UID: \"d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d\") " pod="openstack/nova-cell1-conductor-0" Oct 03 14:22:34 crc kubenswrapper[4636]: I1003 14:22:34.995019 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.158549 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.419600 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.474609 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.477082 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5ccdfa8-6bab-4376-8abf-8def038929c5-logs\") pod \"b5ccdfa8-6bab-4376-8abf-8def038929c5\" (UID: \"b5ccdfa8-6bab-4376-8abf-8def038929c5\") " Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.477141 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjkcl\" (UniqueName: \"kubernetes.io/projected/b5ccdfa8-6bab-4376-8abf-8def038929c5-kube-api-access-pjkcl\") pod \"b5ccdfa8-6bab-4376-8abf-8def038929c5\" (UID: \"b5ccdfa8-6bab-4376-8abf-8def038929c5\") " Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.477227 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5ccdfa8-6bab-4376-8abf-8def038929c5-combined-ca-bundle\") pod \"b5ccdfa8-6bab-4376-8abf-8def038929c5\" (UID: \"b5ccdfa8-6bab-4376-8abf-8def038929c5\") " Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.477318 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5ccdfa8-6bab-4376-8abf-8def038929c5-config-data\") pod \"b5ccdfa8-6bab-4376-8abf-8def038929c5\" (UID: \"b5ccdfa8-6bab-4376-8abf-8def038929c5\") " Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.477681 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5ccdfa8-6bab-4376-8abf-8def038929c5-logs" (OuterVolumeSpecName: "logs") pod "b5ccdfa8-6bab-4376-8abf-8def038929c5" (UID: "b5ccdfa8-6bab-4376-8abf-8def038929c5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.484446 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5ccdfa8-6bab-4376-8abf-8def038929c5-kube-api-access-pjkcl" (OuterVolumeSpecName: "kube-api-access-pjkcl") pod "b5ccdfa8-6bab-4376-8abf-8def038929c5" (UID: "b5ccdfa8-6bab-4376-8abf-8def038929c5"). InnerVolumeSpecName "kube-api-access-pjkcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.505363 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5ccdfa8-6bab-4376-8abf-8def038929c5-config-data" (OuterVolumeSpecName: "config-data") pod "b5ccdfa8-6bab-4376-8abf-8def038929c5" (UID: "b5ccdfa8-6bab-4376-8abf-8def038929c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.508298 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5ccdfa8-6bab-4376-8abf-8def038929c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5ccdfa8-6bab-4376-8abf-8def038929c5" (UID: "b5ccdfa8-6bab-4376-8abf-8def038929c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.575427 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d","Type":"ContainerStarted","Data":"667c5b41e0210ae2fe46bea12cf74f003b727f180160f1090c05621a9bdb7481"} Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.577217 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"94a15f16-0baf-43de-a850-b7a97bac0c65","Type":"ContainerStarted","Data":"2495872738e3d107c0ffb74ee1686ca4768b10b6e8b51240edff9f3173c8267d"} Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.577293 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"94a15f16-0baf-43de-a850-b7a97bac0c65","Type":"ContainerStarted","Data":"bc467e470da40b309ed4432676f9b7d1b1fc9a5ee44dc9ee6993e001fb7c416f"} Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.579498 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5ccdfa8-6bab-4376-8abf-8def038929c5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.579572 4636 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5ccdfa8-6bab-4376-8abf-8def038929c5-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.579622 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjkcl\" (UniqueName: \"kubernetes.io/projected/b5ccdfa8-6bab-4376-8abf-8def038929c5-kube-api-access-pjkcl\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.579692 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5ccdfa8-6bab-4376-8abf-8def038929c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.589298 4636 generic.go:334] "Generic (PLEG): container finished" podID="b5ccdfa8-6bab-4376-8abf-8def038929c5" containerID="8c4ba55c7ed87d560530a235761e873e6f6ce9c9da37eba639124250ee25a1f7" exitCode=0 Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.589364 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5ccdfa8-6bab-4376-8abf-8def038929c5","Type":"ContainerDied","Data":"8c4ba55c7ed87d560530a235761e873e6f6ce9c9da37eba639124250ee25a1f7"} Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.589391 4636 kubelet.go:2453] "SyncLoop 
Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.589639 4636 scope.go:117] "RemoveContainer" containerID="8c4ba55c7ed87d560530a235761e873e6f6ce9c9da37eba639124250ee25a1f7"
Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.589856 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.598681 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.598664883 podStartE2EDuration="2.598664883s" podCreationTimestamp="2025-10-03 14:22:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:22:35.594527746 +0000 UTC m=+1305.453253983" watchObservedRunningTime="2025-10-03 14:22:35.598664883 +0000 UTC m=+1305.457391130"
Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.694170 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.704323 4636 scope.go:117] "RemoveContainer" containerID="96645a637af80246c386a3faf6eb0f0418bb82d9fb75e27420668887c7fbba46"
Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.715609 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.727887 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 03 14:22:35 crc kubenswrapper[4636]: E1003 14:22:35.728464 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5ccdfa8-6bab-4376-8abf-8def038929c5" containerName="nova-api-api"
Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.728488 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5ccdfa8-6bab-4376-8abf-8def038929c5" containerName="nova-api-api"
Oct 03 14:22:35 crc kubenswrapper[4636]: E1003 14:22:35.728511 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5ccdfa8-6bab-4376-8abf-8def038929c5" containerName="nova-api-log"
Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.728520 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5ccdfa8-6bab-4376-8abf-8def038929c5" containerName="nova-api-log"
Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.728724 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5ccdfa8-6bab-4376-8abf-8def038929c5" containerName="nova-api-log"
Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.728749 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5ccdfa8-6bab-4376-8abf-8def038929c5" containerName="nova-api-api"
Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.729530 4636 scope.go:117] "RemoveContainer" containerID="8c4ba55c7ed87d560530a235761e873e6f6ce9c9da37eba639124250ee25a1f7"
Oct 03 14:22:35 crc kubenswrapper[4636]: E1003 14:22:35.729973 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c4ba55c7ed87d560530a235761e873e6f6ce9c9da37eba639124250ee25a1f7\": container with ID starting with 8c4ba55c7ed87d560530a235761e873e6f6ce9c9da37eba639124250ee25a1f7 not found: ID does not exist" containerID="8c4ba55c7ed87d560530a235761e873e6f6ce9c9da37eba639124250ee25a1f7"
Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.730011 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c4ba55c7ed87d560530a235761e873e6f6ce9c9da37eba639124250ee25a1f7"} err="failed to get container status \"8c4ba55c7ed87d560530a235761e873e6f6ce9c9da37eba639124250ee25a1f7\": rpc error: code = NotFound desc = could not find container \"8c4ba55c7ed87d560530a235761e873e6f6ce9c9da37eba639124250ee25a1f7\": container with ID starting with 8c4ba55c7ed87d560530a235761e873e6f6ce9c9da37eba639124250ee25a1f7 not found: ID does not exist"
Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.730037 4636 scope.go:117] "RemoveContainer" containerID="96645a637af80246c386a3faf6eb0f0418bb82d9fb75e27420668887c7fbba46"
Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.730421 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 03 14:22:35 crc kubenswrapper[4636]: E1003 14:22:35.733631 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96645a637af80246c386a3faf6eb0f0418bb82d9fb75e27420668887c7fbba46\": container with ID starting with 96645a637af80246c386a3faf6eb0f0418bb82d9fb75e27420668887c7fbba46 not found: ID does not exist" containerID="96645a637af80246c386a3faf6eb0f0418bb82d9fb75e27420668887c7fbba46"
Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.733671 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96645a637af80246c386a3faf6eb0f0418bb82d9fb75e27420668887c7fbba46"} err="failed to get container status \"96645a637af80246c386a3faf6eb0f0418bb82d9fb75e27420668887c7fbba46\": rpc error: code = NotFound desc = could not find container \"96645a637af80246c386a3faf6eb0f0418bb82d9fb75e27420668887c7fbba46\": container with ID starting with 96645a637af80246c386a3faf6eb0f0418bb82d9fb75e27420668887c7fbba46 not found: ID does not exist"
Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.734141 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.743643 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.789581 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57eced7b-3ce3-4969-824f-e14d4d2d834f-logs\") pod \"nova-api-0\" (UID: \"57eced7b-3ce3-4969-824f-e14d4d2d834f\") " pod="openstack/nova-api-0"
Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.789776 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57eced7b-3ce3-4969-824f-e14d4d2d834f-config-data\") pod \"nova-api-0\" (UID: \"57eced7b-3ce3-4969-824f-e14d4d2d834f\") " pod="openstack/nova-api-0"
Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.789869 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgrwh\" (UniqueName: \"kubernetes.io/projected/57eced7b-3ce3-4969-824f-e14d4d2d834f-kube-api-access-cgrwh\") pod \"nova-api-0\" (UID: \"57eced7b-3ce3-4969-824f-e14d4d2d834f\") " pod="openstack/nova-api-0"
Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.790148 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57eced7b-3ce3-4969-824f-e14d4d2d834f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"57eced7b-3ce3-4969-824f-e14d4d2d834f\") " pod="openstack/nova-api-0"
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57eced7b-3ce3-4969-824f-e14d4d2d834f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"57eced7b-3ce3-4969-824f-e14d4d2d834f\") " pod="openstack/nova-api-0" Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.891464 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57eced7b-3ce3-4969-824f-e14d4d2d834f-logs\") pod \"nova-api-0\" (UID: \"57eced7b-3ce3-4969-824f-e14d4d2d834f\") " pod="openstack/nova-api-0" Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.891598 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57eced7b-3ce3-4969-824f-e14d4d2d834f-config-data\") pod \"nova-api-0\" (UID: \"57eced7b-3ce3-4969-824f-e14d4d2d834f\") " pod="openstack/nova-api-0" Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.891640 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgrwh\" (UniqueName: \"kubernetes.io/projected/57eced7b-3ce3-4969-824f-e14d4d2d834f-kube-api-access-cgrwh\") pod \"nova-api-0\" (UID: \"57eced7b-3ce3-4969-824f-e14d4d2d834f\") " pod="openstack/nova-api-0" Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.891756 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57eced7b-3ce3-4969-824f-e14d4d2d834f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"57eced7b-3ce3-4969-824f-e14d4d2d834f\") " pod="openstack/nova-api-0" Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.892940 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57eced7b-3ce3-4969-824f-e14d4d2d834f-logs\") pod \"nova-api-0\" (UID: \"57eced7b-3ce3-4969-824f-e14d4d2d834f\") " pod="openstack/nova-api-0" Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.902785 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57eced7b-3ce3-4969-824f-e14d4d2d834f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"57eced7b-3ce3-4969-824f-e14d4d2d834f\") " pod="openstack/nova-api-0" Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.906716 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57eced7b-3ce3-4969-824f-e14d4d2d834f-config-data\") pod \"nova-api-0\" (UID: \"57eced7b-3ce3-4969-824f-e14d4d2d834f\") " pod="openstack/nova-api-0" Oct 03 14:22:35 crc kubenswrapper[4636]: I1003 14:22:35.907700 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgrwh\" (UniqueName: \"kubernetes.io/projected/57eced7b-3ce3-4969-824f-e14d4d2d834f-kube-api-access-cgrwh\") pod \"nova-api-0\" (UID: \"57eced7b-3ce3-4969-824f-e14d4d2d834f\") " pod="openstack/nova-api-0" Oct 03 14:22:36 crc kubenswrapper[4636]: I1003 14:22:36.056333 4636 util.go:30] "No sandbox for pod can be found. 
Oct 03 14:22:36 crc kubenswrapper[4636]: W1003 14:22:36.489783 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57eced7b_3ce3_4969_824f_e14d4d2d834f.slice/crio-c007171cc0268d2d4fa55e98d539dc1f50aaab42cdc900536b8d59d3c59fff9f WatchSource:0}: Error finding container c007171cc0268d2d4fa55e98d539dc1f50aaab42cdc900536b8d59d3c59fff9f: Status 404 returned error can't find the container with id c007171cc0268d2d4fa55e98d539dc1f50aaab42cdc900536b8d59d3c59fff9f
Oct 03 14:22:36 crc kubenswrapper[4636]: I1003 14:22:36.492647 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 03 14:22:36 crc kubenswrapper[4636]: I1003 14:22:36.600248 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d","Type":"ContainerStarted","Data":"7f3586736df8363182b6a66f90e4eb48b2905007f22b79a0529f522fae968997"}
Oct 03 14:22:36 crc kubenswrapper[4636]: I1003 14:22:36.601431 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Oct 03 14:22:36 crc kubenswrapper[4636]: I1003 14:22:36.607974 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"57eced7b-3ce3-4969-824f-e14d4d2d834f","Type":"ContainerStarted","Data":"c007171cc0268d2d4fa55e98d539dc1f50aaab42cdc900536b8d59d3c59fff9f"}
Oct 03 14:22:36 crc kubenswrapper[4636]: I1003 14:22:36.622931 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.622910094 podStartE2EDuration="2.622910094s" podCreationTimestamp="2025-10-03 14:22:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:22:36.617026742 +0000 UTC m=+1306.475752999" watchObservedRunningTime="2025-10-03 14:22:36.622910094 +0000 UTC m=+1306.481636341"
Oct 03 14:22:36 crc kubenswrapper[4636]: I1003 14:22:36.805043 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5ccdfa8-6bab-4376-8abf-8def038929c5" path="/var/lib/kubelet/pods/b5ccdfa8-6bab-4376-8abf-8def038929c5/volumes"
Oct 03 14:22:36 crc kubenswrapper[4636]: I1003 14:22:36.974068 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 03 14:22:36 crc kubenswrapper[4636]: I1003 14:22:36.974538 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 03 14:22:37 crc kubenswrapper[4636]: I1003 14:22:37.617243 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"57eced7b-3ce3-4969-824f-e14d4d2d834f","Type":"ContainerStarted","Data":"c6412205edcb268969edbeab3d70cbc816a95268f563a2f61c61f953f625669c"}
Oct 03 14:22:37 crc kubenswrapper[4636]: I1003 14:22:37.617288 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"57eced7b-3ce3-4969-824f-e14d4d2d834f","Type":"ContainerStarted","Data":"cf71903b962326166bc6366a5d8fb85a44e02407045cc3fad582fef5e3df9169"}
Oct 03 14:22:37 crc kubenswrapper[4636]: I1003 14:22:37.639562 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.639541638 podStartE2EDuration="2.639541638s" podCreationTimestamp="2025-10-03 14:22:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:22:37.636617512 +0000 UTC m=+1307.495343759" watchObservedRunningTime="2025-10-03 14:22:37.639541638 +0000 UTC m=+1307.498267885"
Oct 03 14:22:38 crc kubenswrapper[4636]: I1003 14:22:38.974272 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Oct 03 14:22:39 crc kubenswrapper[4636]: I1003 14:22:39.163560 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:22:39 crc kubenswrapper[4636]: I1003 14:22:39.163823 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:22:41 crc kubenswrapper[4636]: I1003 14:22:41.974420 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 03 14:22:41 crc kubenswrapper[4636]: I1003 14:22:41.974684 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 03 14:22:42 crc kubenswrapper[4636]: I1003 14:22:42.988439 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e5d6a8a9-579d-4033-9123-53639ded7cdf" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 03 14:22:42 crc kubenswrapper[4636]: I1003 14:22:42.989639 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e5d6a8a9-579d-4033-9123-53639ded7cdf" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 03 14:22:43 crc kubenswrapper[4636]: I1003 14:22:43.975179 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Oct 03 14:22:44 crc kubenswrapper[4636]: I1003 14:22:44.002445 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Oct 03 14:22:44 crc kubenswrapper[4636]: I1003 14:22:44.723242 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 03 14:22:45 crc kubenswrapper[4636]: I1003 14:22:45.022962 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Oct 03 14:22:46 crc kubenswrapper[4636]: I1003 14:22:46.057194 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 03 14:22:46 crc kubenswrapper[4636]: I1003 14:22:46.058382 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 03 14:22:47 crc kubenswrapper[4636]: I1003 14:22:47.139322 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="57eced7b-3ce3-4969-824f-e14d4d2d834f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
\"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 14:22:47 crc kubenswrapper[4636]: I1003 14:22:47.139379 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="57eced7b-3ce3-4969-824f-e14d4d2d834f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 14:22:51 crc kubenswrapper[4636]: I1003 14:22:51.979722 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 14:22:51 crc kubenswrapper[4636]: I1003 14:22:51.980428 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 14:22:51 crc kubenswrapper[4636]: I1003 14:22:51.984662 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 14:22:51 crc kubenswrapper[4636]: I1003 14:22:51.986367 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 14:22:55 crc kubenswrapper[4636]: I1003 14:22:55.774015 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:55 crc kubenswrapper[4636]: I1003 14:22:55.795920 4636 generic.go:334] "Generic (PLEG): container finished" podID="0517677b-2391-46df-a1a8-e73266c4e056" containerID="b61d893798c224e3d9794c4666fa5becf8734d5817ebf83283aa275633d4fa8f" exitCode=137 Oct 03 14:22:55 crc kubenswrapper[4636]: I1003 14:22:55.795960 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0517677b-2391-46df-a1a8-e73266c4e056","Type":"ContainerDied","Data":"b61d893798c224e3d9794c4666fa5becf8734d5817ebf83283aa275633d4fa8f"} Oct 03 14:22:55 crc kubenswrapper[4636]: I1003 14:22:55.795974 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:55 crc kubenswrapper[4636]: I1003 14:22:55.796325 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0517677b-2391-46df-a1a8-e73266c4e056","Type":"ContainerDied","Data":"3263648b1a69c794a9dd9c805e04d85e9006be8a3fcefe84f1bea57c3ff367ee"} Oct 03 14:22:55 crc kubenswrapper[4636]: I1003 14:22:55.796346 4636 scope.go:117] "RemoveContainer" containerID="b61d893798c224e3d9794c4666fa5becf8734d5817ebf83283aa275633d4fa8f" Oct 03 14:22:55 crc kubenswrapper[4636]: I1003 14:22:55.819039 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtgbh\" (UniqueName: \"kubernetes.io/projected/0517677b-2391-46df-a1a8-e73266c4e056-kube-api-access-rtgbh\") pod \"0517677b-2391-46df-a1a8-e73266c4e056\" (UID: \"0517677b-2391-46df-a1a8-e73266c4e056\") " Oct 03 14:22:55 crc kubenswrapper[4636]: I1003 14:22:55.819221 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0517677b-2391-46df-a1a8-e73266c4e056-combined-ca-bundle\") pod \"0517677b-2391-46df-a1a8-e73266c4e056\" (UID: \"0517677b-2391-46df-a1a8-e73266c4e056\") " Oct 03 14:22:55 crc kubenswrapper[4636]: I1003 14:22:55.820244 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0517677b-2391-46df-a1a8-e73266c4e056-config-data\") pod \"0517677b-2391-46df-a1a8-e73266c4e056\" (UID: \"0517677b-2391-46df-a1a8-e73266c4e056\") " Oct 03 14:22:55 crc kubenswrapper[4636]: I1003 14:22:55.824783 4636 scope.go:117] "RemoveContainer" containerID="b61d893798c224e3d9794c4666fa5becf8734d5817ebf83283aa275633d4fa8f" Oct 03 14:22:55 crc kubenswrapper[4636]: E1003 14:22:55.825733 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b61d893798c224e3d9794c4666fa5becf8734d5817ebf83283aa275633d4fa8f\": container with ID starting with b61d893798c224e3d9794c4666fa5becf8734d5817ebf83283aa275633d4fa8f not found: ID does not exist" containerID="b61d893798c224e3d9794c4666fa5becf8734d5817ebf83283aa275633d4fa8f" Oct 03 14:22:55 crc kubenswrapper[4636]: I1003 14:22:55.825784 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b61d893798c224e3d9794c4666fa5becf8734d5817ebf83283aa275633d4fa8f"} err="failed to get container status \"b61d893798c224e3d9794c4666fa5becf8734d5817ebf83283aa275633d4fa8f\": rpc error: code = NotFound desc = could not find container \"b61d893798c224e3d9794c4666fa5becf8734d5817ebf83283aa275633d4fa8f\": container with ID starting with b61d893798c224e3d9794c4666fa5becf8734d5817ebf83283aa275633d4fa8f not found: ID does not exist" Oct 03 14:22:55 crc kubenswrapper[4636]: I1003 14:22:55.827089 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0517677b-2391-46df-a1a8-e73266c4e056-kube-api-access-rtgbh" (OuterVolumeSpecName: "kube-api-access-rtgbh") pod "0517677b-2391-46df-a1a8-e73266c4e056" (UID: "0517677b-2391-46df-a1a8-e73266c4e056"). InnerVolumeSpecName "kube-api-access-rtgbh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:22:55 crc kubenswrapper[4636]: I1003 14:22:55.851350 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0517677b-2391-46df-a1a8-e73266c4e056-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0517677b-2391-46df-a1a8-e73266c4e056" (UID: "0517677b-2391-46df-a1a8-e73266c4e056"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:22:55 crc kubenswrapper[4636]: I1003 14:22:55.854677 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0517677b-2391-46df-a1a8-e73266c4e056-config-data" (OuterVolumeSpecName: "config-data") pod "0517677b-2391-46df-a1a8-e73266c4e056" (UID: "0517677b-2391-46df-a1a8-e73266c4e056"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:22:55 crc kubenswrapper[4636]: I1003 14:22:55.923353 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtgbh\" (UniqueName: \"kubernetes.io/projected/0517677b-2391-46df-a1a8-e73266c4e056-kube-api-access-rtgbh\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:55 crc kubenswrapper[4636]: I1003 14:22:55.923417 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0517677b-2391-46df-a1a8-e73266c4e056-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:55 crc kubenswrapper[4636]: I1003 14:22:55.923426 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0517677b-2391-46df-a1a8-e73266c4e056-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.061782 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.062437 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.064856 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.067606 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.135664 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.146799 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.159423 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 14:22:56 crc kubenswrapper[4636]: E1003 14:22:56.159880 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0517677b-2391-46df-a1a8-e73266c4e056" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.159901 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="0517677b-2391-46df-a1a8-e73266c4e056" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.160168 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="0517677b-2391-46df-a1a8-e73266c4e056" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 14:22:56 crc 
kubenswrapper[4636]: I1003 14:22:56.160894 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.162701 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.169771 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.180177 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.184805 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.229834 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/68498e63-11ab-4746-ae7f-01662c1e136f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"68498e63-11ab-4746-ae7f-01662c1e136f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.230151 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntjwp\" (UniqueName: \"kubernetes.io/projected/68498e63-11ab-4746-ae7f-01662c1e136f-kube-api-access-ntjwp\") pod \"nova-cell1-novncproxy-0\" (UID: \"68498e63-11ab-4746-ae7f-01662c1e136f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.230275 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/68498e63-11ab-4746-ae7f-01662c1e136f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"68498e63-11ab-4746-ae7f-01662c1e136f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.230422 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68498e63-11ab-4746-ae7f-01662c1e136f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"68498e63-11ab-4746-ae7f-01662c1e136f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.230599 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68498e63-11ab-4746-ae7f-01662c1e136f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"68498e63-11ab-4746-ae7f-01662c1e136f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.332271 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntjwp\" (UniqueName: \"kubernetes.io/projected/68498e63-11ab-4746-ae7f-01662c1e136f-kube-api-access-ntjwp\") pod \"nova-cell1-novncproxy-0\" (UID: \"68498e63-11ab-4746-ae7f-01662c1e136f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.332342 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/68498e63-11ab-4746-ae7f-01662c1e136f-nova-novncproxy-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"68498e63-11ab-4746-ae7f-01662c1e136f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.332403 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68498e63-11ab-4746-ae7f-01662c1e136f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"68498e63-11ab-4746-ae7f-01662c1e136f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.332486 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68498e63-11ab-4746-ae7f-01662c1e136f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"68498e63-11ab-4746-ae7f-01662c1e136f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.332518 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/68498e63-11ab-4746-ae7f-01662c1e136f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"68498e63-11ab-4746-ae7f-01662c1e136f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.336600 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/68498e63-11ab-4746-ae7f-01662c1e136f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"68498e63-11ab-4746-ae7f-01662c1e136f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.336912 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68498e63-11ab-4746-ae7f-01662c1e136f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"68498e63-11ab-4746-ae7f-01662c1e136f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.337570 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68498e63-11ab-4746-ae7f-01662c1e136f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"68498e63-11ab-4746-ae7f-01662c1e136f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.337796 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/68498e63-11ab-4746-ae7f-01662c1e136f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"68498e63-11ab-4746-ae7f-01662c1e136f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.352706 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntjwp\" (UniqueName: \"kubernetes.io/projected/68498e63-11ab-4746-ae7f-01662c1e136f-kube-api-access-ntjwp\") pod \"nova-cell1-novncproxy-0\" (UID: \"68498e63-11ab-4746-ae7f-01662c1e136f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.479766 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.804237 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0517677b-2391-46df-a1a8-e73266c4e056" path="/var/lib/kubelet/pods/0517677b-2391-46df-a1a8-e73266c4e056/volumes" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.809645 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.812741 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 14:22:56 crc kubenswrapper[4636]: I1003 14:22:56.990336 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.120849 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-s7jpb"] Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.127216 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.137709 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-s7jpb"] Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.258964 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rllh\" (UniqueName: \"kubernetes.io/projected/72cdceb9-a893-4565-aa03-d1cbdf9550ae-kube-api-access-7rllh\") pod \"dnsmasq-dns-89c5cd4d5-s7jpb\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.259165 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-s7jpb\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.259225 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-s7jpb\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.259370 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-config\") pod \"dnsmasq-dns-89c5cd4d5-s7jpb\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.259677 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-s7jpb\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.259792 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-s7jpb\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.362324 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-s7jpb\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.362404 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-config\") pod \"dnsmasq-dns-89c5cd4d5-s7jpb\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.362547 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-s7jpb\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.363584 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-s7jpb\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.363679 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-s7jpb\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.363764 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-s7jpb\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.364340 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-config\") pod \"dnsmasq-dns-89c5cd4d5-s7jpb\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.364621 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-s7jpb\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.364791 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rllh\" (UniqueName: \"kubernetes.io/projected/72cdceb9-a893-4565-aa03-d1cbdf9550ae-kube-api-access-7rllh\") pod \"dnsmasq-dns-89c5cd4d5-s7jpb\" (UID: 
\"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.364960 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-s7jpb\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.365815 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-s7jpb\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.383305 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rllh\" (UniqueName: \"kubernetes.io/projected/72cdceb9-a893-4565-aa03-d1cbdf9550ae-kube-api-access-7rllh\") pod \"dnsmasq-dns-89c5cd4d5-s7jpb\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.485421 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.818959 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"68498e63-11ab-4746-ae7f-01662c1e136f","Type":"ContainerStarted","Data":"839bb88596c91234be9dacbf774ef5e13eaa533ebceca250778fd43022406343"} Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.819271 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"68498e63-11ab-4746-ae7f-01662c1e136f","Type":"ContainerStarted","Data":"4eb1cddae50b0ffb1ef28b342620346b6008dcf0a59a3a47d7a2770719168784"} Oct 03 14:22:57 crc kubenswrapper[4636]: I1003 14:22:57.847720 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.847701958 podStartE2EDuration="1.847701958s" podCreationTimestamp="2025-10-03 14:22:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:22:57.838762236 +0000 UTC m=+1327.697488493" watchObservedRunningTime="2025-10-03 14:22:57.847701958 +0000 UTC m=+1327.706428205" Oct 03 14:22:58 crc kubenswrapper[4636]: I1003 14:22:58.026564 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-s7jpb"] Oct 03 14:22:58 crc kubenswrapper[4636]: I1003 14:22:58.828699 4636 generic.go:334] "Generic (PLEG): container finished" podID="72cdceb9-a893-4565-aa03-d1cbdf9550ae" containerID="54d14b88b8b1875e78386e36d5121c6a0131ee8495352237b79ce9afbcbccd33" exitCode=0 Oct 03 14:22:58 crc kubenswrapper[4636]: I1003 14:22:58.829239 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" event={"ID":"72cdceb9-a893-4565-aa03-d1cbdf9550ae","Type":"ContainerDied","Data":"54d14b88b8b1875e78386e36d5121c6a0131ee8495352237b79ce9afbcbccd33"} Oct 03 14:22:58 crc kubenswrapper[4636]: I1003 14:22:58.829297 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" 
event={"ID":"72cdceb9-a893-4565-aa03-d1cbdf9550ae","Type":"ContainerStarted","Data":"566f9ecaf5fcd3a0acf3ff1db8fddce2549ed53d3366d5adcff2f06c9a0ea11a"} Oct 03 14:22:59 crc kubenswrapper[4636]: I1003 14:22:59.593770 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:22:59 crc kubenswrapper[4636]: I1003 14:22:59.594530 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="03b9cd0b-5397-46c3-af28-f6e766fc596b" containerName="ceilometer-central-agent" containerID="cri-o://5fdbce835f4bd9edba3b6583c6d89e3188bf203ad6471b9ff62a7126f26b9527" gracePeriod=30 Oct 03 14:22:59 crc kubenswrapper[4636]: I1003 14:22:59.594614 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="03b9cd0b-5397-46c3-af28-f6e766fc596b" containerName="sg-core" containerID="cri-o://383c4d2f0de37f68dbfba57a9e6f2134a2f1e06da2900b25687eb437d9f8e71e" gracePeriod=30 Oct 03 14:22:59 crc kubenswrapper[4636]: I1003 14:22:59.594614 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="03b9cd0b-5397-46c3-af28-f6e766fc596b" containerName="ceilometer-notification-agent" containerID="cri-o://c31fea35d7abaa093f313d967f734147cfac64f7e364633847676dfae568815b" gracePeriod=30 Oct 03 14:22:59 crc kubenswrapper[4636]: I1003 14:22:59.594843 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="03b9cd0b-5397-46c3-af28-f6e766fc596b" containerName="proxy-httpd" containerID="cri-o://564f47632c5e6bbbcd8cfc439fe5662bcecb6c0f12457b69a6c9058adf9d44e5" gracePeriod=30 Oct 03 14:22:59 crc kubenswrapper[4636]: I1003 14:22:59.797542 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:22:59 crc kubenswrapper[4636]: I1003 14:22:59.840495 4636 generic.go:334] "Generic (PLEG): container finished" podID="03b9cd0b-5397-46c3-af28-f6e766fc596b" containerID="564f47632c5e6bbbcd8cfc439fe5662bcecb6c0f12457b69a6c9058adf9d44e5" exitCode=0 Oct 03 14:22:59 crc kubenswrapper[4636]: I1003 14:22:59.840531 4636 generic.go:334] "Generic (PLEG): container finished" podID="03b9cd0b-5397-46c3-af28-f6e766fc596b" containerID="383c4d2f0de37f68dbfba57a9e6f2134a2f1e06da2900b25687eb437d9f8e71e" exitCode=2 Oct 03 14:22:59 crc kubenswrapper[4636]: I1003 14:22:59.840574 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03b9cd0b-5397-46c3-af28-f6e766fc596b","Type":"ContainerDied","Data":"564f47632c5e6bbbcd8cfc439fe5662bcecb6c0f12457b69a6c9058adf9d44e5"} Oct 03 14:22:59 crc kubenswrapper[4636]: I1003 14:22:59.840604 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03b9cd0b-5397-46c3-af28-f6e766fc596b","Type":"ContainerDied","Data":"383c4d2f0de37f68dbfba57a9e6f2134a2f1e06da2900b25687eb437d9f8e71e"} Oct 03 14:22:59 crc kubenswrapper[4636]: I1003 14:22:59.842665 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="57eced7b-3ce3-4969-824f-e14d4d2d834f" containerName="nova-api-log" containerID="cri-o://cf71903b962326166bc6366a5d8fb85a44e02407045cc3fad582fef5e3df9169" gracePeriod=30 Oct 03 14:22:59 crc kubenswrapper[4636]: I1003 14:22:59.843736 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" 
event={"ID":"72cdceb9-a893-4565-aa03-d1cbdf9550ae","Type":"ContainerStarted","Data":"8e03415c3cfded96380599ccbaabaccdedda9e63a0562d18a2f6268af39c4ef8"} Oct 03 14:22:59 crc kubenswrapper[4636]: I1003 14:22:59.843775 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:22:59 crc kubenswrapper[4636]: I1003 14:22:59.844145 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="57eced7b-3ce3-4969-824f-e14d4d2d834f" containerName="nova-api-api" containerID="cri-o://c6412205edcb268969edbeab3d70cbc816a95268f563a2f61c61f953f625669c" gracePeriod=30 Oct 03 14:22:59 crc kubenswrapper[4636]: I1003 14:22:59.879772 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" podStartSLOduration=2.879750844 podStartE2EDuration="2.879750844s" podCreationTimestamp="2025-10-03 14:22:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:22:59.876532901 +0000 UTC m=+1329.735259168" watchObservedRunningTime="2025-10-03 14:22:59.879750844 +0000 UTC m=+1329.738477091" Oct 03 14:23:00 crc kubenswrapper[4636]: I1003 14:23:00.877677 4636 generic.go:334] "Generic (PLEG): container finished" podID="03b9cd0b-5397-46c3-af28-f6e766fc596b" containerID="5fdbce835f4bd9edba3b6583c6d89e3188bf203ad6471b9ff62a7126f26b9527" exitCode=0 Oct 03 14:23:00 crc kubenswrapper[4636]: I1003 14:23:00.877892 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03b9cd0b-5397-46c3-af28-f6e766fc596b","Type":"ContainerDied","Data":"5fdbce835f4bd9edba3b6583c6d89e3188bf203ad6471b9ff62a7126f26b9527"} Oct 03 14:23:00 crc kubenswrapper[4636]: I1003 14:23:00.883980 4636 generic.go:334] "Generic (PLEG): container finished" podID="57eced7b-3ce3-4969-824f-e14d4d2d834f" containerID="cf71903b962326166bc6366a5d8fb85a44e02407045cc3fad582fef5e3df9169" exitCode=143 Oct 03 14:23:00 crc kubenswrapper[4636]: I1003 14:23:00.885159 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"57eced7b-3ce3-4969-824f-e14d4d2d834f","Type":"ContainerDied","Data":"cf71903b962326166bc6366a5d8fb85a44e02407045cc3fad582fef5e3df9169"} Oct 03 14:23:01 crc kubenswrapper[4636]: I1003 14:23:01.480353 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:23:03 crc kubenswrapper[4636]: I1003 14:23:03.422170 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 14:23:03 crc kubenswrapper[4636]: I1003 14:23:03.592611 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgrwh\" (UniqueName: \"kubernetes.io/projected/57eced7b-3ce3-4969-824f-e14d4d2d834f-kube-api-access-cgrwh\") pod \"57eced7b-3ce3-4969-824f-e14d4d2d834f\" (UID: \"57eced7b-3ce3-4969-824f-e14d4d2d834f\") " Oct 03 14:23:03 crc kubenswrapper[4636]: I1003 14:23:03.592914 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57eced7b-3ce3-4969-824f-e14d4d2d834f-config-data\") pod \"57eced7b-3ce3-4969-824f-e14d4d2d834f\" (UID: \"57eced7b-3ce3-4969-824f-e14d4d2d834f\") " Oct 03 14:23:03 crc kubenswrapper[4636]: I1003 14:23:03.592959 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57eced7b-3ce3-4969-824f-e14d4d2d834f-combined-ca-bundle\") pod \"57eced7b-3ce3-4969-824f-e14d4d2d834f\" (UID: \"57eced7b-3ce3-4969-824f-e14d4d2d834f\") " Oct 03 14:23:03 crc kubenswrapper[4636]: I1003 14:23:03.593026 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57eced7b-3ce3-4969-824f-e14d4d2d834f-logs\") pod \"57eced7b-3ce3-4969-824f-e14d4d2d834f\" (UID: \"57eced7b-3ce3-4969-824f-e14d4d2d834f\") " Oct 03 14:23:03 crc kubenswrapper[4636]: I1003 14:23:03.593862 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57eced7b-3ce3-4969-824f-e14d4d2d834f-logs" (OuterVolumeSpecName: "logs") pod "57eced7b-3ce3-4969-824f-e14d4d2d834f" (UID: "57eced7b-3ce3-4969-824f-e14d4d2d834f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:23:03 crc kubenswrapper[4636]: I1003 14:23:03.599012 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57eced7b-3ce3-4969-824f-e14d4d2d834f-kube-api-access-cgrwh" (OuterVolumeSpecName: "kube-api-access-cgrwh") pod "57eced7b-3ce3-4969-824f-e14d4d2d834f" (UID: "57eced7b-3ce3-4969-824f-e14d4d2d834f"). InnerVolumeSpecName "kube-api-access-cgrwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:23:03 crc kubenswrapper[4636]: I1003 14:23:03.627502 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57eced7b-3ce3-4969-824f-e14d4d2d834f-config-data" (OuterVolumeSpecName: "config-data") pod "57eced7b-3ce3-4969-824f-e14d4d2d834f" (UID: "57eced7b-3ce3-4969-824f-e14d4d2d834f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:03 crc kubenswrapper[4636]: I1003 14:23:03.644366 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57eced7b-3ce3-4969-824f-e14d4d2d834f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57eced7b-3ce3-4969-824f-e14d4d2d834f" (UID: "57eced7b-3ce3-4969-824f-e14d4d2d834f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:03 crc kubenswrapper[4636]: I1003 14:23:03.695029 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgrwh\" (UniqueName: \"kubernetes.io/projected/57eced7b-3ce3-4969-824f-e14d4d2d834f-kube-api-access-cgrwh\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:03 crc kubenswrapper[4636]: I1003 14:23:03.695060 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57eced7b-3ce3-4969-824f-e14d4d2d834f-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:03 crc kubenswrapper[4636]: I1003 14:23:03.695070 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57eced7b-3ce3-4969-824f-e14d4d2d834f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:03 crc kubenswrapper[4636]: I1003 14:23:03.695080 4636 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57eced7b-3ce3-4969-824f-e14d4d2d834f-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:03 crc kubenswrapper[4636]: I1003 14:23:03.910157 4636 generic.go:334] "Generic (PLEG): container finished" podID="57eced7b-3ce3-4969-824f-e14d4d2d834f" containerID="c6412205edcb268969edbeab3d70cbc816a95268f563a2f61c61f953f625669c" exitCode=0 Oct 03 14:23:03 crc kubenswrapper[4636]: I1003 14:23:03.910215 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 14:23:03 crc kubenswrapper[4636]: I1003 14:23:03.910517 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"57eced7b-3ce3-4969-824f-e14d4d2d834f","Type":"ContainerDied","Data":"c6412205edcb268969edbeab3d70cbc816a95268f563a2f61c61f953f625669c"} Oct 03 14:23:03 crc kubenswrapper[4636]: I1003 14:23:03.910628 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"57eced7b-3ce3-4969-824f-e14d4d2d834f","Type":"ContainerDied","Data":"c007171cc0268d2d4fa55e98d539dc1f50aaab42cdc900536b8d59d3c59fff9f"} Oct 03 14:23:03 crc kubenswrapper[4636]: I1003 14:23:03.910693 4636 scope.go:117] "RemoveContainer" containerID="c6412205edcb268969edbeab3d70cbc816a95268f563a2f61c61f953f625669c" Oct 03 14:23:03 crc kubenswrapper[4636]: I1003 14:23:03.936982 4636 scope.go:117] "RemoveContainer" containerID="cf71903b962326166bc6366a5d8fb85a44e02407045cc3fad582fef5e3df9169" Oct 03 14:23:03 crc kubenswrapper[4636]: I1003 14:23:03.943548 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:23:03 crc kubenswrapper[4636]: I1003 14:23:03.993223 4636 scope.go:117] "RemoveContainer" containerID="c6412205edcb268969edbeab3d70cbc816a95268f563a2f61c61f953f625669c" Oct 03 14:23:04 crc kubenswrapper[4636]: E1003 14:23:04.007632 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6412205edcb268969edbeab3d70cbc816a95268f563a2f61c61f953f625669c\": container with ID starting with c6412205edcb268969edbeab3d70cbc816a95268f563a2f61c61f953f625669c not found: ID does not exist" containerID="c6412205edcb268969edbeab3d70cbc816a95268f563a2f61c61f953f625669c" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.007682 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6412205edcb268969edbeab3d70cbc816a95268f563a2f61c61f953f625669c"} err="failed to get container status 
\"c6412205edcb268969edbeab3d70cbc816a95268f563a2f61c61f953f625669c\": rpc error: code = NotFound desc = could not find container \"c6412205edcb268969edbeab3d70cbc816a95268f563a2f61c61f953f625669c\": container with ID starting with c6412205edcb268969edbeab3d70cbc816a95268f563a2f61c61f953f625669c not found: ID does not exist" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.007720 4636 scope.go:117] "RemoveContainer" containerID="cf71903b962326166bc6366a5d8fb85a44e02407045cc3fad582fef5e3df9169" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.007795 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:23:04 crc kubenswrapper[4636]: E1003 14:23:04.013467 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf71903b962326166bc6366a5d8fb85a44e02407045cc3fad582fef5e3df9169\": container with ID starting with cf71903b962326166bc6366a5d8fb85a44e02407045cc3fad582fef5e3df9169 not found: ID does not exist" containerID="cf71903b962326166bc6366a5d8fb85a44e02407045cc3fad582fef5e3df9169" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.013522 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf71903b962326166bc6366a5d8fb85a44e02407045cc3fad582fef5e3df9169"} err="failed to get container status \"cf71903b962326166bc6366a5d8fb85a44e02407045cc3fad582fef5e3df9169\": rpc error: code = NotFound desc = could not find container \"cf71903b962326166bc6366a5d8fb85a44e02407045cc3fad582fef5e3df9169\": container with ID starting with cf71903b962326166bc6366a5d8fb85a44e02407045cc3fad582fef5e3df9169 not found: ID does not exist" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.060160 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 14:23:04 crc kubenswrapper[4636]: E1003 14:23:04.060960 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57eced7b-3ce3-4969-824f-e14d4d2d834f" containerName="nova-api-log" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.060978 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="57eced7b-3ce3-4969-824f-e14d4d2d834f" containerName="nova-api-log" Oct 03 14:23:04 crc kubenswrapper[4636]: E1003 14:23:04.061014 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57eced7b-3ce3-4969-824f-e14d4d2d834f" containerName="nova-api-api" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.061020 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="57eced7b-3ce3-4969-824f-e14d4d2d834f" containerName="nova-api-api" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.061347 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="57eced7b-3ce3-4969-824f-e14d4d2d834f" containerName="nova-api-log" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.061371 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="57eced7b-3ce3-4969-824f-e14d4d2d834f" containerName="nova-api-api" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.062810 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.070448 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.070685 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.070888 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.119066 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.225675 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed7be015-ebed-48f9-b88b-4fae5001511c-logs\") pod \"nova-api-0\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " pod="openstack/nova-api-0" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.225768 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " pod="openstack/nova-api-0" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.225848 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xdlq\" (UniqueName: \"kubernetes.io/projected/ed7be015-ebed-48f9-b88b-4fae5001511c-kube-api-access-6xdlq\") pod \"nova-api-0\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " pod="openstack/nova-api-0" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.225866 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " pod="openstack/nova-api-0" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.225909 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-public-tls-certs\") pod \"nova-api-0\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " pod="openstack/nova-api-0" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.225943 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-config-data\") pod \"nova-api-0\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " pod="openstack/nova-api-0" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.327721 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed7be015-ebed-48f9-b88b-4fae5001511c-logs\") pod \"nova-api-0\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " pod="openstack/nova-api-0" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.327794 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"ed7be015-ebed-48f9-b88b-4fae5001511c\") " pod="openstack/nova-api-0" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.327858 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xdlq\" (UniqueName: \"kubernetes.io/projected/ed7be015-ebed-48f9-b88b-4fae5001511c-kube-api-access-6xdlq\") pod \"nova-api-0\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " pod="openstack/nova-api-0" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.327877 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " pod="openstack/nova-api-0" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.327920 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-public-tls-certs\") pod \"nova-api-0\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " pod="openstack/nova-api-0" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.327954 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-config-data\") pod \"nova-api-0\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " pod="openstack/nova-api-0" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.328324 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed7be015-ebed-48f9-b88b-4fae5001511c-logs\") pod \"nova-api-0\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " pod="openstack/nova-api-0" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.332179 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-public-tls-certs\") pod \"nova-api-0\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " pod="openstack/nova-api-0" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.332785 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " pod="openstack/nova-api-0" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.335717 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " pod="openstack/nova-api-0" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.336007 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-config-data\") pod \"nova-api-0\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " pod="openstack/nova-api-0" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.347972 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xdlq\" (UniqueName: \"kubernetes.io/projected/ed7be015-ebed-48f9-b88b-4fae5001511c-kube-api-access-6xdlq\") pod \"nova-api-0\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " 
pod="openstack/nova-api-0" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.423598 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.803420 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57eced7b-3ce3-4969-824f-e14d4d2d834f" path="/var/lib/kubelet/pods/57eced7b-3ce3-4969-824f-e14d4d2d834f/volumes" Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.893211 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:23:04 crc kubenswrapper[4636]: I1003 14:23:04.919696 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ed7be015-ebed-48f9-b88b-4fae5001511c","Type":"ContainerStarted","Data":"b7c2b9810070a8335c9f3e756ab07014c1ace577cfa89849c135621b60c4aacc"} Oct 03 14:23:05 crc kubenswrapper[4636]: I1003 14:23:05.931698 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ed7be015-ebed-48f9-b88b-4fae5001511c","Type":"ContainerStarted","Data":"f68e0284fb2d29216433a2429b1d2698059756deb224a76feb6646a9c745b834"} Oct 03 14:23:05 crc kubenswrapper[4636]: I1003 14:23:05.933166 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ed7be015-ebed-48f9-b88b-4fae5001511c","Type":"ContainerStarted","Data":"ffafdd9752b894ef9001f94845b20b8ae023a574abdb4da1b45b85fbc40d384d"} Oct 03 14:23:05 crc kubenswrapper[4636]: I1003 14:23:05.950925 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.950908748 podStartE2EDuration="2.950908748s" podCreationTimestamp="2025-10-03 14:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:23:05.949609394 +0000 UTC m=+1335.808335641" watchObservedRunningTime="2025-10-03 14:23:05.950908748 +0000 UTC m=+1335.809634995" Oct 03 14:23:06 crc kubenswrapper[4636]: I1003 14:23:06.481054 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:23:06 crc kubenswrapper[4636]: I1003 14:23:06.499803 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:23:06 crc kubenswrapper[4636]: I1003 14:23:06.944677 4636 generic.go:334] "Generic (PLEG): container finished" podID="03b9cd0b-5397-46c3-af28-f6e766fc596b" containerID="c31fea35d7abaa093f313d967f734147cfac64f7e364633847676dfae568815b" exitCode=0 Oct 03 14:23:06 crc kubenswrapper[4636]: I1003 14:23:06.945891 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03b9cd0b-5397-46c3-af28-f6e766fc596b","Type":"ContainerDied","Data":"c31fea35d7abaa093f313d967f734147cfac64f7e364633847676dfae568815b"} Oct 03 14:23:06 crc kubenswrapper[4636]: I1003 14:23:06.960985 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.136714 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-f4zgp"] Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.138340 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-f4zgp" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.140829 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.141087 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.150246 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-f4zgp"] Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.286632 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78747294-969f-4563-9d83-19f46b0045aa-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-f4zgp\" (UID: \"78747294-969f-4563-9d83-19f46b0045aa\") " pod="openstack/nova-cell1-cell-mapping-f4zgp" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.286734 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xkrd\" (UniqueName: \"kubernetes.io/projected/78747294-969f-4563-9d83-19f46b0045aa-kube-api-access-5xkrd\") pod \"nova-cell1-cell-mapping-f4zgp\" (UID: \"78747294-969f-4563-9d83-19f46b0045aa\") " pod="openstack/nova-cell1-cell-mapping-f4zgp" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.286792 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78747294-969f-4563-9d83-19f46b0045aa-scripts\") pod \"nova-cell1-cell-mapping-f4zgp\" (UID: \"78747294-969f-4563-9d83-19f46b0045aa\") " pod="openstack/nova-cell1-cell-mapping-f4zgp" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.286856 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78747294-969f-4563-9d83-19f46b0045aa-config-data\") pod \"nova-cell1-cell-mapping-f4zgp\" (UID: \"78747294-969f-4563-9d83-19f46b0045aa\") " pod="openstack/nova-cell1-cell-mapping-f4zgp" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.298235 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.389249 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78747294-969f-4563-9d83-19f46b0045aa-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-f4zgp\" (UID: \"78747294-969f-4563-9d83-19f46b0045aa\") " pod="openstack/nova-cell1-cell-mapping-f4zgp" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.389321 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xkrd\" (UniqueName: \"kubernetes.io/projected/78747294-969f-4563-9d83-19f46b0045aa-kube-api-access-5xkrd\") pod \"nova-cell1-cell-mapping-f4zgp\" (UID: \"78747294-969f-4563-9d83-19f46b0045aa\") " pod="openstack/nova-cell1-cell-mapping-f4zgp" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.389366 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78747294-969f-4563-9d83-19f46b0045aa-scripts\") pod \"nova-cell1-cell-mapping-f4zgp\" (UID: \"78747294-969f-4563-9d83-19f46b0045aa\") " pod="openstack/nova-cell1-cell-mapping-f4zgp" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.389416 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78747294-969f-4563-9d83-19f46b0045aa-config-data\") pod \"nova-cell1-cell-mapping-f4zgp\" (UID: \"78747294-969f-4563-9d83-19f46b0045aa\") " pod="openstack/nova-cell1-cell-mapping-f4zgp" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.396591 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78747294-969f-4563-9d83-19f46b0045aa-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-f4zgp\" (UID: \"78747294-969f-4563-9d83-19f46b0045aa\") " pod="openstack/nova-cell1-cell-mapping-f4zgp" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.410812 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78747294-969f-4563-9d83-19f46b0045aa-scripts\") pod \"nova-cell1-cell-mapping-f4zgp\" (UID: \"78747294-969f-4563-9d83-19f46b0045aa\") " pod="openstack/nova-cell1-cell-mapping-f4zgp" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.416847 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xkrd\" (UniqueName: \"kubernetes.io/projected/78747294-969f-4563-9d83-19f46b0045aa-kube-api-access-5xkrd\") pod \"nova-cell1-cell-mapping-f4zgp\" (UID: \"78747294-969f-4563-9d83-19f46b0045aa\") " pod="openstack/nova-cell1-cell-mapping-f4zgp" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.435954 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78747294-969f-4563-9d83-19f46b0045aa-config-data\") pod \"nova-cell1-cell-mapping-f4zgp\" (UID: \"78747294-969f-4563-9d83-19f46b0045aa\") " pod="openstack/nova-cell1-cell-mapping-f4zgp" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.487341 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.490421 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-scripts\") pod \"03b9cd0b-5397-46c3-af28-f6e766fc596b\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.491447 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-ceilometer-tls-certs\") pod \"03b9cd0b-5397-46c3-af28-f6e766fc596b\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.491494 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-sg-core-conf-yaml\") pod \"03b9cd0b-5397-46c3-af28-f6e766fc596b\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.491804 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-config-data\") pod \"03b9cd0b-5397-46c3-af28-f6e766fc596b\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.491840 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03b9cd0b-5397-46c3-af28-f6e766fc596b-log-httpd\") pod \"03b9cd0b-5397-46c3-af28-f6e766fc596b\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.491988 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-combined-ca-bundle\") pod \"03b9cd0b-5397-46c3-af28-f6e766fc596b\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.492018 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03b9cd0b-5397-46c3-af28-f6e766fc596b-run-httpd\") pod \"03b9cd0b-5397-46c3-af28-f6e766fc596b\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.492042 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wncbf\" (UniqueName: \"kubernetes.io/projected/03b9cd0b-5397-46c3-af28-f6e766fc596b-kube-api-access-wncbf\") pod \"03b9cd0b-5397-46c3-af28-f6e766fc596b\" (UID: \"03b9cd0b-5397-46c3-af28-f6e766fc596b\") " Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.494083 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-scripts" (OuterVolumeSpecName: "scripts") pod "03b9cd0b-5397-46c3-af28-f6e766fc596b" (UID: "03b9cd0b-5397-46c3-af28-f6e766fc596b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.494829 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03b9cd0b-5397-46c3-af28-f6e766fc596b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "03b9cd0b-5397-46c3-af28-f6e766fc596b" (UID: "03b9cd0b-5397-46c3-af28-f6e766fc596b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.496296 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03b9cd0b-5397-46c3-af28-f6e766fc596b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "03b9cd0b-5397-46c3-af28-f6e766fc596b" (UID: "03b9cd0b-5397-46c3-af28-f6e766fc596b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.496822 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03b9cd0b-5397-46c3-af28-f6e766fc596b-kube-api-access-wncbf" (OuterVolumeSpecName: "kube-api-access-wncbf") pod "03b9cd0b-5397-46c3-af28-f6e766fc596b" (UID: "03b9cd0b-5397-46c3-af28-f6e766fc596b"). InnerVolumeSpecName "kube-api-access-wncbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.501238 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-f4zgp" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.585123 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-xcbk7"] Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.587134 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" podUID="83ea03ed-bc48-4baf-a2a0-89af7e25e5b9" containerName="dnsmasq-dns" containerID="cri-o://185b5522a1d25463a32210cfa6011ad870d1f668cd64096f563d9f3a75c7dd92" gracePeriod=10 Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.591772 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "03b9cd0b-5397-46c3-af28-f6e766fc596b" (UID: "03b9cd0b-5397-46c3-af28-f6e766fc596b"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.604298 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wncbf\" (UniqueName: \"kubernetes.io/projected/03b9cd0b-5397-46c3-af28-f6e766fc596b-kube-api-access-wncbf\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.604319 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.604329 4636 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.604338 4636 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03b9cd0b-5397-46c3-af28-f6e766fc596b-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.604346 4636 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03b9cd0b-5397-46c3-af28-f6e766fc596b-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.654605 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "03b9cd0b-5397-46c3-af28-f6e766fc596b" (UID: "03b9cd0b-5397-46c3-af28-f6e766fc596b"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.654705 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03b9cd0b-5397-46c3-af28-f6e766fc596b" (UID: "03b9cd0b-5397-46c3-af28-f6e766fc596b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.706797 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.706827 4636 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.771220 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-config-data" (OuterVolumeSpecName: "config-data") pod "03b9cd0b-5397-46c3-af28-f6e766fc596b" (UID: "03b9cd0b-5397-46c3-af28-f6e766fc596b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.808675 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03b9cd0b-5397-46c3-af28-f6e766fc596b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.958354 4636 generic.go:334] "Generic (PLEG): container finished" podID="83ea03ed-bc48-4baf-a2a0-89af7e25e5b9" containerID="185b5522a1d25463a32210cfa6011ad870d1f668cd64096f563d9f3a75c7dd92" exitCode=0 Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.958412 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" event={"ID":"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9","Type":"ContainerDied","Data":"185b5522a1d25463a32210cfa6011ad870d1f668cd64096f563d9f3a75c7dd92"} Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.960796 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03b9cd0b-5397-46c3-af28-f6e766fc596b","Type":"ContainerDied","Data":"7b6af38c7b40ebd6ba67b770af8702bb27fd394bd7ea17c4c363b599efbbd055"} Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.960854 4636 scope.go:117] "RemoveContainer" containerID="564f47632c5e6bbbcd8cfc439fe5662bcecb6c0f12457b69a6c9058adf9d44e5" Oct 03 14:23:07 crc kubenswrapper[4636]: I1003 14:23:07.961175 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.037221 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-f4zgp"] Oct 03 14:23:08 crc kubenswrapper[4636]: W1003 14:23:08.059730 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78747294_969f_4563_9d83_19f46b0045aa.slice/crio-6cf8c5be33f922b25fe96f2db13decf6af3c515c697ace43c4a005be1c962cc6 WatchSource:0}: Error finding container 6cf8c5be33f922b25fe96f2db13decf6af3c515c697ace43c4a005be1c962cc6: Status 404 returned error can't find the container with id 6cf8c5be33f922b25fe96f2db13decf6af3c515c697ace43c4a005be1c962cc6 Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.077458 4636 scope.go:117] "RemoveContainer" containerID="383c4d2f0de37f68dbfba57a9e6f2134a2f1e06da2900b25687eb437d9f8e71e" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.179237 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.188338 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.193432 4636 scope.go:117] "RemoveContainer" containerID="c31fea35d7abaa093f313d967f734147cfac64f7e364633847676dfae568815b" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.199185 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.222013 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:23:08 crc kubenswrapper[4636]: E1003 14:23:08.222759 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83ea03ed-bc48-4baf-a2a0-89af7e25e5b9" containerName="dnsmasq-dns" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.223512 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="83ea03ed-bc48-4baf-a2a0-89af7e25e5b9" containerName="dnsmasq-dns" Oct 03 14:23:08 crc kubenswrapper[4636]: E1003 14:23:08.223656 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b9cd0b-5397-46c3-af28-f6e766fc596b" containerName="ceilometer-notification-agent" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.223752 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b9cd0b-5397-46c3-af28-f6e766fc596b" containerName="ceilometer-notification-agent" Oct 03 14:23:08 crc kubenswrapper[4636]: E1003 14:23:08.223856 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83ea03ed-bc48-4baf-a2a0-89af7e25e5b9" containerName="init" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.223952 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="83ea03ed-bc48-4baf-a2a0-89af7e25e5b9" containerName="init" Oct 03 14:23:08 crc kubenswrapper[4636]: E1003 14:23:08.224048 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b9cd0b-5397-46c3-af28-f6e766fc596b" containerName="sg-core" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.224147 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b9cd0b-5397-46c3-af28-f6e766fc596b" containerName="sg-core" Oct 03 14:23:08 crc kubenswrapper[4636]: E1003 14:23:08.224279 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b9cd0b-5397-46c3-af28-f6e766fc596b" containerName="proxy-httpd" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.224375 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b9cd0b-5397-46c3-af28-f6e766fc596b" containerName="proxy-httpd" Oct 03 14:23:08 crc kubenswrapper[4636]: E1003 14:23:08.224508 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b9cd0b-5397-46c3-af28-f6e766fc596b" containerName="ceilometer-central-agent" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.224605 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b9cd0b-5397-46c3-af28-f6e766fc596b" containerName="ceilometer-central-agent" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.229197 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="03b9cd0b-5397-46c3-af28-f6e766fc596b" containerName="proxy-httpd" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.229350 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="03b9cd0b-5397-46c3-af28-f6e766fc596b" containerName="sg-core" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.229505 4636 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="03b9cd0b-5397-46c3-af28-f6e766fc596b" containerName="ceilometer-notification-agent" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.229907 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="03b9cd0b-5397-46c3-af28-f6e766fc596b" containerName="ceilometer-central-agent" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.230011 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="83ea03ed-bc48-4baf-a2a0-89af7e25e5b9" containerName="dnsmasq-dns" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.233994 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.241834 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.241916 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.242123 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.245249 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.295928 4636 scope.go:117] "RemoveContainer" containerID="5fdbce835f4bd9edba3b6583c6d89e3188bf203ad6471b9ff62a7126f26b9527" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.321049 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-dns-svc\") pod \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.321190 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-ovsdbserver-nb\") pod \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.321230 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-ovsdbserver-sb\") pod \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.321270 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-config\") pod \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.321329 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-872nj\" (UniqueName: \"kubernetes.io/projected/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-kube-api-access-872nj\") pod \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.321423 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-dns-swift-storage-0\") pod \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\" (UID: \"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9\") " Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.321627 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.321652 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.321682 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-run-httpd\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.321726 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-scripts\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.321761 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxc7g\" (UniqueName: \"kubernetes.io/projected/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-kube-api-access-dxc7g\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.321797 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-log-httpd\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.321843 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-config-data\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.321867 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.338020 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-kube-api-access-872nj" (OuterVolumeSpecName: "kube-api-access-872nj") pod "83ea03ed-bc48-4baf-a2a0-89af7e25e5b9" (UID: "83ea03ed-bc48-4baf-a2a0-89af7e25e5b9"). 
InnerVolumeSpecName "kube-api-access-872nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.423038 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-config-data\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.423079 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.423151 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.423167 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.423191 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-run-httpd\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.423231 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-scripts\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.423262 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxc7g\" (UniqueName: \"kubernetes.io/projected/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-kube-api-access-dxc7g\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.423295 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-log-httpd\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.423348 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-872nj\" (UniqueName: \"kubernetes.io/projected/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-kube-api-access-872nj\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.423661 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-log-httpd\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " 
pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.431340 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.432394 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-run-httpd\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.432959 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "83ea03ed-bc48-4baf-a2a0-89af7e25e5b9" (UID: "83ea03ed-bc48-4baf-a2a0-89af7e25e5b9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.432987 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "83ea03ed-bc48-4baf-a2a0-89af7e25e5b9" (UID: "83ea03ed-bc48-4baf-a2a0-89af7e25e5b9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.433363 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.436208 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-scripts\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.440651 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "83ea03ed-bc48-4baf-a2a0-89af7e25e5b9" (UID: "83ea03ed-bc48-4baf-a2a0-89af7e25e5b9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.442999 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-config-data\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.452144 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.454601 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxc7g\" (UniqueName: \"kubernetes.io/projected/ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e-kube-api-access-dxc7g\") pod \"ceilometer-0\" (UID: \"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e\") " pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.482813 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-config" (OuterVolumeSpecName: "config") pod "83ea03ed-bc48-4baf-a2a0-89af7e25e5b9" (UID: "83ea03ed-bc48-4baf-a2a0-89af7e25e5b9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.483648 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "83ea03ed-bc48-4baf-a2a0-89af7e25e5b9" (UID: "83ea03ed-bc48-4baf-a2a0-89af7e25e5b9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.526226 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.526261 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.526271 4636 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.526280 4636 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.526288 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.603362 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.806314 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03b9cd0b-5397-46c3-af28-f6e766fc596b" path="/var/lib/kubelet/pods/03b9cd0b-5397-46c3-af28-f6e766fc596b/volumes" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.975199 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" event={"ID":"83ea03ed-bc48-4baf-a2a0-89af7e25e5b9","Type":"ContainerDied","Data":"4b12f278d2381a1bef3dd17586469d8ab1a99e3f6e172e627706901d8e21a8cb"} Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.975387 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-xcbk7" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.976456 4636 scope.go:117] "RemoveContainer" containerID="185b5522a1d25463a32210cfa6011ad870d1f668cd64096f563d9f3a75c7dd92" Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.989243 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-f4zgp" event={"ID":"78747294-969f-4563-9d83-19f46b0045aa","Type":"ContainerStarted","Data":"5a713a957138047512d15df01d3093a25f5a18668f9494292c1a3c8e43d40cea"} Oct 03 14:23:08 crc kubenswrapper[4636]: I1003 14:23:08.989392 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-f4zgp" event={"ID":"78747294-969f-4563-9d83-19f46b0045aa","Type":"ContainerStarted","Data":"6cf8c5be33f922b25fe96f2db13decf6af3c515c697ace43c4a005be1c962cc6"} Oct 03 14:23:09 crc kubenswrapper[4636]: I1003 14:23:09.019496 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-f4zgp" podStartSLOduration=2.019476132 podStartE2EDuration="2.019476132s" podCreationTimestamp="2025-10-03 14:23:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:23:09.01165288 +0000 UTC m=+1338.870379127" watchObservedRunningTime="2025-10-03 14:23:09.019476132 +0000 UTC m=+1338.878202379" Oct 03 14:23:09 crc kubenswrapper[4636]: I1003 14:23:09.043952 4636 scope.go:117] "RemoveContainer" containerID="2edcfcda25643af1c3527e1726f933daf77199b509187f8cfd0f5ea5b52ff180" Oct 03 14:23:09 crc kubenswrapper[4636]: I1003 14:23:09.054143 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-xcbk7"] Oct 03 14:23:09 crc kubenswrapper[4636]: I1003 14:23:09.061793 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-xcbk7"] Oct 03 14:23:09 crc kubenswrapper[4636]: I1003 14:23:09.118192 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 14:23:09 crc kubenswrapper[4636]: I1003 14:23:09.162687 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:23:09 crc kubenswrapper[4636]: I1003 14:23:09.162741 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 03 14:23:10 crc kubenswrapper[4636]: I1003 14:23:10.006176 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e","Type":"ContainerStarted","Data":"b09c5b6c29bd01ad92fa86303a66ee3c663f314bc27b8ed781af9523bfb32bdc"} Oct 03 14:23:10 crc kubenswrapper[4636]: I1003 14:23:10.006746 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e","Type":"ContainerStarted","Data":"d9f65099e200f153614365445e868e4b32c01ca7873f7c9c7a17ade1c66b0bb8"} Oct 03 14:23:10 crc kubenswrapper[4636]: I1003 14:23:10.802604 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83ea03ed-bc48-4baf-a2a0-89af7e25e5b9" path="/var/lib/kubelet/pods/83ea03ed-bc48-4baf-a2a0-89af7e25e5b9/volumes" Oct 03 14:23:11 crc kubenswrapper[4636]: I1003 14:23:11.014209 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e","Type":"ContainerStarted","Data":"448ee480536f8685112146e1c54ba5f9921846425d97295bb15b60234a1c81c7"} Oct 03 14:23:12 crc kubenswrapper[4636]: I1003 14:23:12.024994 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e","Type":"ContainerStarted","Data":"a1bfcbeaf015410ebd4a073450159cdcdda8dae82aae840e9a33f0a6e6e049e6"} Oct 03 14:23:14 crc kubenswrapper[4636]: I1003 14:23:14.047617 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e","Type":"ContainerStarted","Data":"f6df08dafbfc16b032429167e9aa43fe2e05db729a4ba603c2a3c0dbfea5f523"} Oct 03 14:23:14 crc kubenswrapper[4636]: I1003 14:23:14.048581 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 14:23:14 crc kubenswrapper[4636]: I1003 14:23:14.050394 4636 generic.go:334] "Generic (PLEG): container finished" podID="78747294-969f-4563-9d83-19f46b0045aa" containerID="5a713a957138047512d15df01d3093a25f5a18668f9494292c1a3c8e43d40cea" exitCode=0 Oct 03 14:23:14 crc kubenswrapper[4636]: I1003 14:23:14.050420 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-f4zgp" event={"ID":"78747294-969f-4563-9d83-19f46b0045aa","Type":"ContainerDied","Data":"5a713a957138047512d15df01d3093a25f5a18668f9494292c1a3c8e43d40cea"} Oct 03 14:23:14 crc kubenswrapper[4636]: I1003 14:23:14.067182 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.201154482 podStartE2EDuration="6.067162515s" podCreationTimestamp="2025-10-03 14:23:08 +0000 UTC" firstStartedPulling="2025-10-03 14:23:09.147968813 +0000 UTC m=+1339.006695060" lastFinishedPulling="2025-10-03 14:23:13.013976846 +0000 UTC m=+1342.872703093" observedRunningTime="2025-10-03 14:23:14.065577224 +0000 UTC m=+1343.924303471" watchObservedRunningTime="2025-10-03 14:23:14.067162515 +0000 UTC m=+1343.925888762" Oct 03 14:23:14 crc kubenswrapper[4636]: I1003 14:23:14.424634 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 14:23:14 crc kubenswrapper[4636]: I1003 14:23:14.424676 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 14:23:15 crc kubenswrapper[4636]: I1003 14:23:15.432291 4636 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-f4zgp" Oct 03 14:23:15 crc kubenswrapper[4636]: I1003 14:23:15.438303 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ed7be015-ebed-48f9-b88b-4fae5001511c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 14:23:15 crc kubenswrapper[4636]: I1003 14:23:15.438325 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ed7be015-ebed-48f9-b88b-4fae5001511c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 14:23:15 crc kubenswrapper[4636]: I1003 14:23:15.558837 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78747294-969f-4563-9d83-19f46b0045aa-config-data\") pod \"78747294-969f-4563-9d83-19f46b0045aa\" (UID: \"78747294-969f-4563-9d83-19f46b0045aa\") " Oct 03 14:23:15 crc kubenswrapper[4636]: I1003 14:23:15.558975 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xkrd\" (UniqueName: \"kubernetes.io/projected/78747294-969f-4563-9d83-19f46b0045aa-kube-api-access-5xkrd\") pod \"78747294-969f-4563-9d83-19f46b0045aa\" (UID: \"78747294-969f-4563-9d83-19f46b0045aa\") " Oct 03 14:23:15 crc kubenswrapper[4636]: I1003 14:23:15.559790 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78747294-969f-4563-9d83-19f46b0045aa-scripts\") pod \"78747294-969f-4563-9d83-19f46b0045aa\" (UID: \"78747294-969f-4563-9d83-19f46b0045aa\") " Oct 03 14:23:15 crc kubenswrapper[4636]: I1003 14:23:15.559873 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78747294-969f-4563-9d83-19f46b0045aa-combined-ca-bundle\") pod \"78747294-969f-4563-9d83-19f46b0045aa\" (UID: \"78747294-969f-4563-9d83-19f46b0045aa\") " Oct 03 14:23:15 crc kubenswrapper[4636]: I1003 14:23:15.564750 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78747294-969f-4563-9d83-19f46b0045aa-scripts" (OuterVolumeSpecName: "scripts") pod "78747294-969f-4563-9d83-19f46b0045aa" (UID: "78747294-969f-4563-9d83-19f46b0045aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:15 crc kubenswrapper[4636]: I1003 14:23:15.578567 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78747294-969f-4563-9d83-19f46b0045aa-kube-api-access-5xkrd" (OuterVolumeSpecName: "kube-api-access-5xkrd") pod "78747294-969f-4563-9d83-19f46b0045aa" (UID: "78747294-969f-4563-9d83-19f46b0045aa"). InnerVolumeSpecName "kube-api-access-5xkrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:23:15 crc kubenswrapper[4636]: I1003 14:23:15.588988 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78747294-969f-4563-9d83-19f46b0045aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78747294-969f-4563-9d83-19f46b0045aa" (UID: "78747294-969f-4563-9d83-19f46b0045aa"). InnerVolumeSpecName "combined-ca-bundle". 
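The two nova-api-0 startup-probe failures just above carry the parenthetical "(Client.Timeout exceeded while awaiting headers)", which is what Go's net/http client appends when its overall Timeout fires before response headers arrive. A minimal sketch reproducing that class of error follows; the URL is the nova-api endpoint from the log, the 1-second timeout is an assumed value (the log does not state the probe's timeoutSeconds), and the exact error prefix varies by Go version.

```go
// Illustrative sketch: a client-side timeout of the kind behind the
// Startup probe failures above. Not the kubelet prober itself.
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{
		Timeout: 1 * time.Second, // assumed; the probe's real timeoutSeconds is not in the log
		Transport: &http.Transport{
			// Probes against self-signed service certs typically skip verification.
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}
	_, err := client.Get("https://10.217.0.201:8774/")
	// When the client-side timeout fires before headers arrive, err ends in
	// "(Client.Timeout exceeded while awaiting headers)".
	fmt.Println(err)
}
```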
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:15 crc kubenswrapper[4636]: I1003 14:23:15.594919 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78747294-969f-4563-9d83-19f46b0045aa-config-data" (OuterVolumeSpecName: "config-data") pod "78747294-969f-4563-9d83-19f46b0045aa" (UID: "78747294-969f-4563-9d83-19f46b0045aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:15 crc kubenswrapper[4636]: I1003 14:23:15.662264 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78747294-969f-4563-9d83-19f46b0045aa-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:15 crc kubenswrapper[4636]: I1003 14:23:15.662465 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xkrd\" (UniqueName: \"kubernetes.io/projected/78747294-969f-4563-9d83-19f46b0045aa-kube-api-access-5xkrd\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:15 crc kubenswrapper[4636]: I1003 14:23:15.662523 4636 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78747294-969f-4563-9d83-19f46b0045aa-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:15 crc kubenswrapper[4636]: I1003 14:23:15.662575 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78747294-969f-4563-9d83-19f46b0045aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:16 crc kubenswrapper[4636]: I1003 14:23:16.068697 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-f4zgp" event={"ID":"78747294-969f-4563-9d83-19f46b0045aa","Type":"ContainerDied","Data":"6cf8c5be33f922b25fe96f2db13decf6af3c515c697ace43c4a005be1c962cc6"} Oct 03 14:23:16 crc kubenswrapper[4636]: I1003 14:23:16.069017 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cf8c5be33f922b25fe96f2db13decf6af3c515c697ace43c4a005be1c962cc6" Oct 03 14:23:16 crc kubenswrapper[4636]: I1003 14:23:16.069083 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-f4zgp" Oct 03 14:23:16 crc kubenswrapper[4636]: I1003 14:23:16.270772 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:23:16 crc kubenswrapper[4636]: I1003 14:23:16.271038 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ed7be015-ebed-48f9-b88b-4fae5001511c" containerName="nova-api-log" containerID="cri-o://ffafdd9752b894ef9001f94845b20b8ae023a574abdb4da1b45b85fbc40d384d" gracePeriod=30 Oct 03 14:23:16 crc kubenswrapper[4636]: I1003 14:23:16.271080 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ed7be015-ebed-48f9-b88b-4fae5001511c" containerName="nova-api-api" containerID="cri-o://f68e0284fb2d29216433a2429b1d2698059756deb224a76feb6646a9c745b834" gracePeriod=30 Oct 03 14:23:16 crc kubenswrapper[4636]: I1003 14:23:16.300068 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 14:23:16 crc kubenswrapper[4636]: I1003 14:23:16.300473 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="94a15f16-0baf-43de-a850-b7a97bac0c65" containerName="nova-scheduler-scheduler" containerID="cri-o://2495872738e3d107c0ffb74ee1686ca4768b10b6e8b51240edff9f3173c8267d" gracePeriod=30 Oct 03 14:23:16 crc kubenswrapper[4636]: I1003 14:23:16.387649 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:23:16 crc kubenswrapper[4636]: I1003 14:23:16.388154 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e5d6a8a9-579d-4033-9123-53639ded7cdf" containerName="nova-metadata-metadata" containerID="cri-o://b8a0fb1d7b200803187cb6a9bef0cf24d19750d6c72cbfd2825c0571e08cbd85" gracePeriod=30 Oct 03 14:23:16 crc kubenswrapper[4636]: I1003 14:23:16.388579 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e5d6a8a9-579d-4033-9123-53639ded7cdf" containerName="nova-metadata-log" containerID="cri-o://b2fe2bb07409d315e531d383bbce7c09ac70821f4d5009db0da4c4bed2468774" gracePeriod=30 Oct 03 14:23:17 crc kubenswrapper[4636]: I1003 14:23:17.079832 4636 generic.go:334] "Generic (PLEG): container finished" podID="ed7be015-ebed-48f9-b88b-4fae5001511c" containerID="ffafdd9752b894ef9001f94845b20b8ae023a574abdb4da1b45b85fbc40d384d" exitCode=143 Oct 03 14:23:17 crc kubenswrapper[4636]: I1003 14:23:17.079885 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ed7be015-ebed-48f9-b88b-4fae5001511c","Type":"ContainerDied","Data":"ffafdd9752b894ef9001f94845b20b8ae023a574abdb4da1b45b85fbc40d384d"} Oct 03 14:23:17 crc kubenswrapper[4636]: I1003 14:23:17.082203 4636 generic.go:334] "Generic (PLEG): container finished" podID="e5d6a8a9-579d-4033-9123-53639ded7cdf" containerID="b2fe2bb07409d315e531d383bbce7c09ac70821f4d5009db0da4c4bed2468774" exitCode=143 Oct 03 14:23:17 crc kubenswrapper[4636]: I1003 14:23:17.082225 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5d6a8a9-579d-4033-9123-53639ded7cdf","Type":"ContainerDied","Data":"b2fe2bb07409d315e531d383bbce7c09ac70821f4d5009db0da4c4bed2468774"} Oct 03 14:23:18 crc kubenswrapper[4636]: E1003 14:23:18.975353 4636 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container 
is not created or running: checking if PID of 2495872738e3d107c0ffb74ee1686ca4768b10b6e8b51240edff9f3173c8267d is running failed: container process not found" containerID="2495872738e3d107c0ffb74ee1686ca4768b10b6e8b51240edff9f3173c8267d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 14:23:18 crc kubenswrapper[4636]: E1003 14:23:18.976807 4636 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2495872738e3d107c0ffb74ee1686ca4768b10b6e8b51240edff9f3173c8267d is running failed: container process not found" containerID="2495872738e3d107c0ffb74ee1686ca4768b10b6e8b51240edff9f3173c8267d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 14:23:18 crc kubenswrapper[4636]: E1003 14:23:18.977046 4636 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2495872738e3d107c0ffb74ee1686ca4768b10b6e8b51240edff9f3173c8267d is running failed: container process not found" containerID="2495872738e3d107c0ffb74ee1686ca4768b10b6e8b51240edff9f3173c8267d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 14:23:18 crc kubenswrapper[4636]: E1003 14:23:18.977074 4636 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2495872738e3d107c0ffb74ee1686ca4768b10b6e8b51240edff9f3173c8267d is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="94a15f16-0baf-43de-a850-b7a97bac0c65" containerName="nova-scheduler-scheduler" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.035570 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.112405 4636 generic.go:334] "Generic (PLEG): container finished" podID="94a15f16-0baf-43de-a850-b7a97bac0c65" containerID="2495872738e3d107c0ffb74ee1686ca4768b10b6e8b51240edff9f3173c8267d" exitCode=0 Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.112451 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"94a15f16-0baf-43de-a850-b7a97bac0c65","Type":"ContainerDied","Data":"2495872738e3d107c0ffb74ee1686ca4768b10b6e8b51240edff9f3173c8267d"} Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.112483 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"94a15f16-0baf-43de-a850-b7a97bac0c65","Type":"ContainerDied","Data":"bc467e470da40b309ed4432676f9b7d1b1fc9a5ee44dc9ee6993e001fb7c416f"} Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.112502 4636 scope.go:117] "RemoveContainer" containerID="2495872738e3d107c0ffb74ee1686ca4768b10b6e8b51240edff9f3173c8267d" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.112666 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.121874 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94a15f16-0baf-43de-a850-b7a97bac0c65-config-data\") pod \"94a15f16-0baf-43de-a850-b7a97bac0c65\" (UID: \"94a15f16-0baf-43de-a850-b7a97bac0c65\") " Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.122116 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55mwv\" (UniqueName: \"kubernetes.io/projected/94a15f16-0baf-43de-a850-b7a97bac0c65-kube-api-access-55mwv\") pod \"94a15f16-0baf-43de-a850-b7a97bac0c65\" (UID: \"94a15f16-0baf-43de-a850-b7a97bac0c65\") " Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.122191 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a15f16-0baf-43de-a850-b7a97bac0c65-combined-ca-bundle\") pod \"94a15f16-0baf-43de-a850-b7a97bac0c65\" (UID: \"94a15f16-0baf-43de-a850-b7a97bac0c65\") " Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.160138 4636 scope.go:117] "RemoveContainer" containerID="2495872738e3d107c0ffb74ee1686ca4768b10b6e8b51240edff9f3173c8267d" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.165510 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a15f16-0baf-43de-a850-b7a97bac0c65-kube-api-access-55mwv" (OuterVolumeSpecName: "kube-api-access-55mwv") pod "94a15f16-0baf-43de-a850-b7a97bac0c65" (UID: "94a15f16-0baf-43de-a850-b7a97bac0c65"). InnerVolumeSpecName "kube-api-access-55mwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:23:19 crc kubenswrapper[4636]: E1003 14:23:19.166024 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2495872738e3d107c0ffb74ee1686ca4768b10b6e8b51240edff9f3173c8267d\": container with ID starting with 2495872738e3d107c0ffb74ee1686ca4768b10b6e8b51240edff9f3173c8267d not found: ID does not exist" containerID="2495872738e3d107c0ffb74ee1686ca4768b10b6e8b51240edff9f3173c8267d" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.166262 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2495872738e3d107c0ffb74ee1686ca4768b10b6e8b51240edff9f3173c8267d"} err="failed to get container status \"2495872738e3d107c0ffb74ee1686ca4768b10b6e8b51240edff9f3173c8267d\": rpc error: code = NotFound desc = could not find container \"2495872738e3d107c0ffb74ee1686ca4768b10b6e8b51240edff9f3173c8267d\": container with ID starting with 2495872738e3d107c0ffb74ee1686ca4768b10b6e8b51240edff9f3173c8267d not found: ID does not exist" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.185340 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94a15f16-0baf-43de-a850-b7a97bac0c65-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94a15f16-0baf-43de-a850-b7a97bac0c65" (UID: "94a15f16-0baf-43de-a850-b7a97bac0c65"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.190906 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94a15f16-0baf-43de-a850-b7a97bac0c65-config-data" (OuterVolumeSpecName: "config-data") pod "94a15f16-0baf-43de-a850-b7a97bac0c65" (UID: "94a15f16-0baf-43de-a850-b7a97bac0c65"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.224136 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94a15f16-0baf-43de-a850-b7a97bac0c65-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.224309 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55mwv\" (UniqueName: \"kubernetes.io/projected/94a15f16-0baf-43de-a850-b7a97bac0c65-kube-api-access-55mwv\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.224365 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a15f16-0baf-43de-a850-b7a97bac0c65-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.447155 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.455874 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.464451 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 14:23:19 crc kubenswrapper[4636]: E1003 14:23:19.464807 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78747294-969f-4563-9d83-19f46b0045aa" containerName="nova-manage" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.464825 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="78747294-969f-4563-9d83-19f46b0045aa" containerName="nova-manage" Oct 03 14:23:19 crc kubenswrapper[4636]: E1003 14:23:19.464866 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a15f16-0baf-43de-a850-b7a97bac0c65" containerName="nova-scheduler-scheduler" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.464872 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a15f16-0baf-43de-a850-b7a97bac0c65" containerName="nova-scheduler-scheduler" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.465044 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="94a15f16-0baf-43de-a850-b7a97bac0c65" containerName="nova-scheduler-scheduler" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.465062 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="78747294-969f-4563-9d83-19f46b0045aa" containerName="nova-manage" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.465648 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.469405 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.529300 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493bf5be-a62b-4d5e-8de8-082ab7d23842-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"493bf5be-a62b-4d5e-8de8-082ab7d23842\") " pod="openstack/nova-scheduler-0" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.529677 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/493bf5be-a62b-4d5e-8de8-082ab7d23842-config-data\") pod \"nova-scheduler-0\" (UID: \"493bf5be-a62b-4d5e-8de8-082ab7d23842\") " pod="openstack/nova-scheduler-0" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.529794 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghcjd\" (UniqueName: \"kubernetes.io/projected/493bf5be-a62b-4d5e-8de8-082ab7d23842-kube-api-access-ghcjd\") pod \"nova-scheduler-0\" (UID: \"493bf5be-a62b-4d5e-8de8-082ab7d23842\") " pod="openstack/nova-scheduler-0" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.531924 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.631282 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493bf5be-a62b-4d5e-8de8-082ab7d23842-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"493bf5be-a62b-4d5e-8de8-082ab7d23842\") " pod="openstack/nova-scheduler-0" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.631640 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/493bf5be-a62b-4d5e-8de8-082ab7d23842-config-data\") pod \"nova-scheduler-0\" (UID: \"493bf5be-a62b-4d5e-8de8-082ab7d23842\") " pod="openstack/nova-scheduler-0" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.631759 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghcjd\" (UniqueName: \"kubernetes.io/projected/493bf5be-a62b-4d5e-8de8-082ab7d23842-kube-api-access-ghcjd\") pod \"nova-scheduler-0\" (UID: \"493bf5be-a62b-4d5e-8de8-082ab7d23842\") " pod="openstack/nova-scheduler-0" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.636844 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493bf5be-a62b-4d5e-8de8-082ab7d23842-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"493bf5be-a62b-4d5e-8de8-082ab7d23842\") " pod="openstack/nova-scheduler-0" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.643994 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/493bf5be-a62b-4d5e-8de8-082ab7d23842-config-data\") pod \"nova-scheduler-0\" (UID: \"493bf5be-a62b-4d5e-8de8-082ab7d23842\") " pod="openstack/nova-scheduler-0" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.673729 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghcjd\" (UniqueName: 
\"kubernetes.io/projected/493bf5be-a62b-4d5e-8de8-082ab7d23842-kube-api-access-ghcjd\") pod \"nova-scheduler-0\" (UID: \"493bf5be-a62b-4d5e-8de8-082ab7d23842\") " pod="openstack/nova-scheduler-0" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.781417 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.841851 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e5d6a8a9-579d-4033-9123-53639ded7cdf" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": read tcp 10.217.0.2:58496->10.217.0.195:8775: read: connection reset by peer" Oct 03 14:23:19 crc kubenswrapper[4636]: I1003 14:23:19.841883 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e5d6a8a9-579d-4033-9123-53639ded7cdf" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": read tcp 10.217.0.2:58508->10.217.0.195:8775: read: connection reset by peer" Oct 03 14:23:20 crc kubenswrapper[4636]: I1003 14:23:20.123251 4636 generic.go:334] "Generic (PLEG): container finished" podID="e5d6a8a9-579d-4033-9123-53639ded7cdf" containerID="b8a0fb1d7b200803187cb6a9bef0cf24d19750d6c72cbfd2825c0571e08cbd85" exitCode=0 Oct 03 14:23:20 crc kubenswrapper[4636]: I1003 14:23:20.123319 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5d6a8a9-579d-4033-9123-53639ded7cdf","Type":"ContainerDied","Data":"b8a0fb1d7b200803187cb6a9bef0cf24d19750d6c72cbfd2825c0571e08cbd85"} Oct 03 14:23:20 crc kubenswrapper[4636]: I1003 14:23:20.251241 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 14:23:20 crc kubenswrapper[4636]: I1003 14:23:20.267497 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 14:23:20 crc kubenswrapper[4636]: I1003 14:23:20.349041 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5d6a8a9-579d-4033-9123-53639ded7cdf-config-data\") pod \"e5d6a8a9-579d-4033-9123-53639ded7cdf\" (UID: \"e5d6a8a9-579d-4033-9123-53639ded7cdf\") " Oct 03 14:23:20 crc kubenswrapper[4636]: I1003 14:23:20.349207 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5d6a8a9-579d-4033-9123-53639ded7cdf-logs\") pod \"e5d6a8a9-579d-4033-9123-53639ded7cdf\" (UID: \"e5d6a8a9-579d-4033-9123-53639ded7cdf\") " Oct 03 14:23:20 crc kubenswrapper[4636]: I1003 14:23:20.349792 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d6a8a9-579d-4033-9123-53639ded7cdf-combined-ca-bundle\") pod \"e5d6a8a9-579d-4033-9123-53639ded7cdf\" (UID: \"e5d6a8a9-579d-4033-9123-53639ded7cdf\") " Oct 03 14:23:20 crc kubenswrapper[4636]: I1003 14:23:20.351465 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5d6a8a9-579d-4033-9123-53639ded7cdf-logs" (OuterVolumeSpecName: "logs") pod "e5d6a8a9-579d-4033-9123-53639ded7cdf" (UID: "e5d6a8a9-579d-4033-9123-53639ded7cdf"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:23:20 crc kubenswrapper[4636]: I1003 14:23:20.353455 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68tmj\" (UniqueName: \"kubernetes.io/projected/e5d6a8a9-579d-4033-9123-53639ded7cdf-kube-api-access-68tmj\") pod \"e5d6a8a9-579d-4033-9123-53639ded7cdf\" (UID: \"e5d6a8a9-579d-4033-9123-53639ded7cdf\") " Oct 03 14:23:20 crc kubenswrapper[4636]: I1003 14:23:20.353973 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d6a8a9-579d-4033-9123-53639ded7cdf-nova-metadata-tls-certs\") pod \"e5d6a8a9-579d-4033-9123-53639ded7cdf\" (UID: \"e5d6a8a9-579d-4033-9123-53639ded7cdf\") " Oct 03 14:23:20 crc kubenswrapper[4636]: I1003 14:23:20.359972 4636 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5d6a8a9-579d-4033-9123-53639ded7cdf-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:20 crc kubenswrapper[4636]: I1003 14:23:20.361008 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5d6a8a9-579d-4033-9123-53639ded7cdf-kube-api-access-68tmj" (OuterVolumeSpecName: "kube-api-access-68tmj") pod "e5d6a8a9-579d-4033-9123-53639ded7cdf" (UID: "e5d6a8a9-579d-4033-9123-53639ded7cdf"). InnerVolumeSpecName "kube-api-access-68tmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:23:20 crc kubenswrapper[4636]: I1003 14:23:20.423086 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5d6a8a9-579d-4033-9123-53639ded7cdf-config-data" (OuterVolumeSpecName: "config-data") pod "e5d6a8a9-579d-4033-9123-53639ded7cdf" (UID: "e5d6a8a9-579d-4033-9123-53639ded7cdf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:20 crc kubenswrapper[4636]: I1003 14:23:20.427315 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5d6a8a9-579d-4033-9123-53639ded7cdf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5d6a8a9-579d-4033-9123-53639ded7cdf" (UID: "e5d6a8a9-579d-4033-9123-53639ded7cdf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:20 crc kubenswrapper[4636]: I1003 14:23:20.439663 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5d6a8a9-579d-4033-9123-53639ded7cdf-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e5d6a8a9-579d-4033-9123-53639ded7cdf" (UID: "e5d6a8a9-579d-4033-9123-53639ded7cdf"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:20 crc kubenswrapper[4636]: I1003 14:23:20.466013 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68tmj\" (UniqueName: \"kubernetes.io/projected/e5d6a8a9-579d-4033-9123-53639ded7cdf-kube-api-access-68tmj\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:20 crc kubenswrapper[4636]: I1003 14:23:20.466054 4636 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d6a8a9-579d-4033-9123-53639ded7cdf-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:20 crc kubenswrapper[4636]: I1003 14:23:20.466067 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5d6a8a9-579d-4033-9123-53639ded7cdf-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:20 crc kubenswrapper[4636]: I1003 14:23:20.466082 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d6a8a9-579d-4033-9123-53639ded7cdf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:20 crc kubenswrapper[4636]: I1003 14:23:20.808818 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94a15f16-0baf-43de-a850-b7a97bac0c65" path="/var/lib/kubelet/pods/94a15f16-0baf-43de-a850-b7a97bac0c65/volumes" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.133797 4636 generic.go:334] "Generic (PLEG): container finished" podID="ed7be015-ebed-48f9-b88b-4fae5001511c" containerID="f68e0284fb2d29216433a2429b1d2698059756deb224a76feb6646a9c745b834" exitCode=0 Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.133858 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ed7be015-ebed-48f9-b88b-4fae5001511c","Type":"ContainerDied","Data":"f68e0284fb2d29216433a2429b1d2698059756deb224a76feb6646a9c745b834"} Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.133881 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ed7be015-ebed-48f9-b88b-4fae5001511c","Type":"ContainerDied","Data":"b7c2b9810070a8335c9f3e756ab07014c1ace577cfa89849c135621b60c4aacc"} Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.133890 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7c2b9810070a8335c9f3e756ab07014c1ace577cfa89849c135621b60c4aacc" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.135581 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"493bf5be-a62b-4d5e-8de8-082ab7d23842","Type":"ContainerStarted","Data":"a53aef21f7bc8ffe5b4dade010b2fcde69edfce084e06bdc711b41cfae52f8cd"} Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.135605 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"493bf5be-a62b-4d5e-8de8-082ab7d23842","Type":"ContainerStarted","Data":"d37f899874c6cfde6dad6f024d2316fe1f05e61a0249f1126f81e3beeafe700a"} Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.145669 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.145670 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5d6a8a9-579d-4033-9123-53639ded7cdf","Type":"ContainerDied","Data":"e7c15eb643fa4414edc1457ba23ee323a783aa0e9e66276b741fa7bbfff2574d"} Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.145823 4636 scope.go:117] "RemoveContainer" containerID="b8a0fb1d7b200803187cb6a9bef0cf24d19750d6c72cbfd2825c0571e08cbd85" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.161925 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.161906362 podStartE2EDuration="2.161906362s" podCreationTimestamp="2025-10-03 14:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:23:21.153777322 +0000 UTC m=+1351.012503579" watchObservedRunningTime="2025-10-03 14:23:21.161906362 +0000 UTC m=+1351.020632609" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.201688 4636 scope.go:117] "RemoveContainer" containerID="b2fe2bb07409d315e531d383bbce7c09ac70821f4d5009db0da4c4bed2468774" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.202621 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.220301 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.231891 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.253157 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:23:21 crc kubenswrapper[4636]: E1003 14:23:21.253528 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed7be015-ebed-48f9-b88b-4fae5001511c" containerName="nova-api-log" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.253544 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed7be015-ebed-48f9-b88b-4fae5001511c" containerName="nova-api-log" Oct 03 14:23:21 crc kubenswrapper[4636]: E1003 14:23:21.253578 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed7be015-ebed-48f9-b88b-4fae5001511c" containerName="nova-api-api" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.253585 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed7be015-ebed-48f9-b88b-4fae5001511c" containerName="nova-api-api" Oct 03 14:23:21 crc kubenswrapper[4636]: E1003 14:23:21.253596 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5d6a8a9-579d-4033-9123-53639ded7cdf" containerName="nova-metadata-log" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.253602 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5d6a8a9-579d-4033-9123-53639ded7cdf" containerName="nova-metadata-log" Oct 03 14:23:21 crc kubenswrapper[4636]: E1003 14:23:21.253610 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5d6a8a9-579d-4033-9123-53639ded7cdf" containerName="nova-metadata-metadata" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.253616 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5d6a8a9-579d-4033-9123-53639ded7cdf" containerName="nova-metadata-metadata" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.253773 
4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5d6a8a9-579d-4033-9123-53639ded7cdf" containerName="nova-metadata-log" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.253792 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed7be015-ebed-48f9-b88b-4fae5001511c" containerName="nova-api-api" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.253807 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed7be015-ebed-48f9-b88b-4fae5001511c" containerName="nova-api-log" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.253818 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5d6a8a9-579d-4033-9123-53639ded7cdf" containerName="nova-metadata-metadata" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.254694 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.261079 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.261134 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.274963 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.282942 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-config-data\") pod \"ed7be015-ebed-48f9-b88b-4fae5001511c\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.283041 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-internal-tls-certs\") pod \"ed7be015-ebed-48f9-b88b-4fae5001511c\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.283076 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed7be015-ebed-48f9-b88b-4fae5001511c-logs\") pod \"ed7be015-ebed-48f9-b88b-4fae5001511c\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.283248 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-public-tls-certs\") pod \"ed7be015-ebed-48f9-b88b-4fae5001511c\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.283271 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xdlq\" (UniqueName: \"kubernetes.io/projected/ed7be015-ebed-48f9-b88b-4fae5001511c-kube-api-access-6xdlq\") pod \"ed7be015-ebed-48f9-b88b-4fae5001511c\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.283297 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-combined-ca-bundle\") pod \"ed7be015-ebed-48f9-b88b-4fae5001511c\" (UID: \"ed7be015-ebed-48f9-b88b-4fae5001511c\") " Oct 03 
14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.285030 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed7be015-ebed-48f9-b88b-4fae5001511c-logs" (OuterVolumeSpecName: "logs") pod "ed7be015-ebed-48f9-b88b-4fae5001511c" (UID: "ed7be015-ebed-48f9-b88b-4fae5001511c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.302776 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed7be015-ebed-48f9-b88b-4fae5001511c-kube-api-access-6xdlq" (OuterVolumeSpecName: "kube-api-access-6xdlq") pod "ed7be015-ebed-48f9-b88b-4fae5001511c" (UID: "ed7be015-ebed-48f9-b88b-4fae5001511c"). InnerVolumeSpecName "kube-api-access-6xdlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.343465 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ed7be015-ebed-48f9-b88b-4fae5001511c" (UID: "ed7be015-ebed-48f9-b88b-4fae5001511c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.352281 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ed7be015-ebed-48f9-b88b-4fae5001511c" (UID: "ed7be015-ebed-48f9-b88b-4fae5001511c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.373738 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed7be015-ebed-48f9-b88b-4fae5001511c" (UID: "ed7be015-ebed-48f9-b88b-4fae5001511c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.375343 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-config-data" (OuterVolumeSpecName: "config-data") pod "ed7be015-ebed-48f9-b88b-4fae5001511c" (UID: "ed7be015-ebed-48f9-b88b-4fae5001511c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.385284 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c9bc86e-3770-40e9-bf37-80627278032b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4c9bc86e-3770-40e9-bf37-80627278032b\") " pod="openstack/nova-metadata-0" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.385324 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c9bc86e-3770-40e9-bf37-80627278032b-config-data\") pod \"nova-metadata-0\" (UID: \"4c9bc86e-3770-40e9-bf37-80627278032b\") " pod="openstack/nova-metadata-0" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.385378 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c9bc86e-3770-40e9-bf37-80627278032b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4c9bc86e-3770-40e9-bf37-80627278032b\") " pod="openstack/nova-metadata-0" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.385410 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c9bc86e-3770-40e9-bf37-80627278032b-logs\") pod \"nova-metadata-0\" (UID: \"4c9bc86e-3770-40e9-bf37-80627278032b\") " pod="openstack/nova-metadata-0" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.385471 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jptgn\" (UniqueName: \"kubernetes.io/projected/4c9bc86e-3770-40e9-bf37-80627278032b-kube-api-access-jptgn\") pod \"nova-metadata-0\" (UID: \"4c9bc86e-3770-40e9-bf37-80627278032b\") " pod="openstack/nova-metadata-0" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.385528 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.385541 4636 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.385551 4636 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed7be015-ebed-48f9-b88b-4fae5001511c-logs\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.385560 4636 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.385570 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xdlq\" (UniqueName: \"kubernetes.io/projected/ed7be015-ebed-48f9-b88b-4fae5001511c-kube-api-access-6xdlq\") on node \"crc\" DevicePath \"\"" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.385578 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7be015-ebed-48f9-b88b-4fae5001511c-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.486983 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c9bc86e-3770-40e9-bf37-80627278032b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4c9bc86e-3770-40e9-bf37-80627278032b\") " pod="openstack/nova-metadata-0" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.487033 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c9bc86e-3770-40e9-bf37-80627278032b-logs\") pod \"nova-metadata-0\" (UID: \"4c9bc86e-3770-40e9-bf37-80627278032b\") " pod="openstack/nova-metadata-0" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.487444 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c9bc86e-3770-40e9-bf37-80627278032b-logs\") pod \"nova-metadata-0\" (UID: \"4c9bc86e-3770-40e9-bf37-80627278032b\") " pod="openstack/nova-metadata-0" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.488385 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jptgn\" (UniqueName: \"kubernetes.io/projected/4c9bc86e-3770-40e9-bf37-80627278032b-kube-api-access-jptgn\") pod \"nova-metadata-0\" (UID: \"4c9bc86e-3770-40e9-bf37-80627278032b\") " pod="openstack/nova-metadata-0" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.488552 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c9bc86e-3770-40e9-bf37-80627278032b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4c9bc86e-3770-40e9-bf37-80627278032b\") " pod="openstack/nova-metadata-0" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.488603 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c9bc86e-3770-40e9-bf37-80627278032b-config-data\") pod \"nova-metadata-0\" (UID: \"4c9bc86e-3770-40e9-bf37-80627278032b\") " pod="openstack/nova-metadata-0" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.492162 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c9bc86e-3770-40e9-bf37-80627278032b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4c9bc86e-3770-40e9-bf37-80627278032b\") " pod="openstack/nova-metadata-0" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.492497 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c9bc86e-3770-40e9-bf37-80627278032b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4c9bc86e-3770-40e9-bf37-80627278032b\") " pod="openstack/nova-metadata-0" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.494747 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c9bc86e-3770-40e9-bf37-80627278032b-config-data\") pod \"nova-metadata-0\" (UID: \"4c9bc86e-3770-40e9-bf37-80627278032b\") " pod="openstack/nova-metadata-0" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.505210 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jptgn\" (UniqueName: \"kubernetes.io/projected/4c9bc86e-3770-40e9-bf37-80627278032b-kube-api-access-jptgn\") pod \"nova-metadata-0\" (UID: 
\"4c9bc86e-3770-40e9-bf37-80627278032b\") " pod="openstack/nova-metadata-0" Oct 03 14:23:21 crc kubenswrapper[4636]: I1003 14:23:21.580713 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.037775 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.159215 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c9bc86e-3770-40e9-bf37-80627278032b","Type":"ContainerStarted","Data":"606e2841e80e4f140a56ca3ce955045a7eff8121ae71b5054a1f76cd49cb1aa9"} Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.159249 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.198309 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.208453 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.229296 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.231213 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.234856 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.235143 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.235320 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.251874 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.310061 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whqbx\" (UniqueName: \"kubernetes.io/projected/8c04651e-c4ab-4322-ae46-6ee8a115ed64-kube-api-access-whqbx\") pod \"nova-api-0\" (UID: \"8c04651e-c4ab-4322-ae46-6ee8a115ed64\") " pod="openstack/nova-api-0" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.310187 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c04651e-c4ab-4322-ae46-6ee8a115ed64-logs\") pod \"nova-api-0\" (UID: \"8c04651e-c4ab-4322-ae46-6ee8a115ed64\") " pod="openstack/nova-api-0" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.310222 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c04651e-c4ab-4322-ae46-6ee8a115ed64-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8c04651e-c4ab-4322-ae46-6ee8a115ed64\") " pod="openstack/nova-api-0" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.310309 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c04651e-c4ab-4322-ae46-6ee8a115ed64-config-data\") pod \"nova-api-0\" (UID: 
\"8c04651e-c4ab-4322-ae46-6ee8a115ed64\") " pod="openstack/nova-api-0" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.310340 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c04651e-c4ab-4322-ae46-6ee8a115ed64-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8c04651e-c4ab-4322-ae46-6ee8a115ed64\") " pod="openstack/nova-api-0" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.310358 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c04651e-c4ab-4322-ae46-6ee8a115ed64-public-tls-certs\") pod \"nova-api-0\" (UID: \"8c04651e-c4ab-4322-ae46-6ee8a115ed64\") " pod="openstack/nova-api-0" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.421938 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c04651e-c4ab-4322-ae46-6ee8a115ed64-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8c04651e-c4ab-4322-ae46-6ee8a115ed64\") " pod="openstack/nova-api-0" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.422035 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c04651e-c4ab-4322-ae46-6ee8a115ed64-config-data\") pod \"nova-api-0\" (UID: \"8c04651e-c4ab-4322-ae46-6ee8a115ed64\") " pod="openstack/nova-api-0" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.422063 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c04651e-c4ab-4322-ae46-6ee8a115ed64-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8c04651e-c4ab-4322-ae46-6ee8a115ed64\") " pod="openstack/nova-api-0" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.422082 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c04651e-c4ab-4322-ae46-6ee8a115ed64-public-tls-certs\") pod \"nova-api-0\" (UID: \"8c04651e-c4ab-4322-ae46-6ee8a115ed64\") " pod="openstack/nova-api-0" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.422244 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whqbx\" (UniqueName: \"kubernetes.io/projected/8c04651e-c4ab-4322-ae46-6ee8a115ed64-kube-api-access-whqbx\") pod \"nova-api-0\" (UID: \"8c04651e-c4ab-4322-ae46-6ee8a115ed64\") " pod="openstack/nova-api-0" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.422320 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c04651e-c4ab-4322-ae46-6ee8a115ed64-logs\") pod \"nova-api-0\" (UID: \"8c04651e-c4ab-4322-ae46-6ee8a115ed64\") " pod="openstack/nova-api-0" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.422768 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c04651e-c4ab-4322-ae46-6ee8a115ed64-logs\") pod \"nova-api-0\" (UID: \"8c04651e-c4ab-4322-ae46-6ee8a115ed64\") " pod="openstack/nova-api-0" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.429731 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c04651e-c4ab-4322-ae46-6ee8a115ed64-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"8c04651e-c4ab-4322-ae46-6ee8a115ed64\") " pod="openstack/nova-api-0" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.435771 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c04651e-c4ab-4322-ae46-6ee8a115ed64-config-data\") pod \"nova-api-0\" (UID: \"8c04651e-c4ab-4322-ae46-6ee8a115ed64\") " pod="openstack/nova-api-0" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.440768 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c04651e-c4ab-4322-ae46-6ee8a115ed64-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8c04651e-c4ab-4322-ae46-6ee8a115ed64\") " pod="openstack/nova-api-0" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.442638 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whqbx\" (UniqueName: \"kubernetes.io/projected/8c04651e-c4ab-4322-ae46-6ee8a115ed64-kube-api-access-whqbx\") pod \"nova-api-0\" (UID: \"8c04651e-c4ab-4322-ae46-6ee8a115ed64\") " pod="openstack/nova-api-0" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.445986 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c04651e-c4ab-4322-ae46-6ee8a115ed64-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8c04651e-c4ab-4322-ae46-6ee8a115ed64\") " pod="openstack/nova-api-0" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.549927 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.846941 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5d6a8a9-579d-4033-9123-53639ded7cdf" path="/var/lib/kubelet/pods/e5d6a8a9-579d-4033-9123-53639ded7cdf/volumes" Oct 03 14:23:22 crc kubenswrapper[4636]: I1003 14:23:22.851061 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed7be015-ebed-48f9-b88b-4fae5001511c" path="/var/lib/kubelet/pods/ed7be015-ebed-48f9-b88b-4fae5001511c/volumes" Oct 03 14:23:23 crc kubenswrapper[4636]: I1003 14:23:23.109951 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 14:23:23 crc kubenswrapper[4636]: I1003 14:23:23.171389 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c9bc86e-3770-40e9-bf37-80627278032b","Type":"ContainerStarted","Data":"de3db8d5ddfdc9362c88754e7697724df78aa8bafc7d68398f25f2020015aea6"} Oct 03 14:23:23 crc kubenswrapper[4636]: I1003 14:23:23.171442 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c9bc86e-3770-40e9-bf37-80627278032b","Type":"ContainerStarted","Data":"22afab9ca5b2ebd20f14c646b50b17fa83799e8b39cb4c2dda70ce4ee6ec9e79"} Oct 03 14:23:23 crc kubenswrapper[4636]: I1003 14:23:23.179124 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c04651e-c4ab-4322-ae46-6ee8a115ed64","Type":"ContainerStarted","Data":"397178ac4912e0fd6a421b7b22c5554b5cb87e2638c498b8ac30b1579363fd88"} Oct 03 14:23:23 crc kubenswrapper[4636]: I1003 14:23:23.197641 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.197622433 podStartE2EDuration="2.197622433s" podCreationTimestamp="2025-10-03 14:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:23:23.189344439 +0000 UTC m=+1353.048070696" watchObservedRunningTime="2025-10-03 14:23:23.197622433 +0000 UTC m=+1353.056348680" Oct 03 14:23:24 crc kubenswrapper[4636]: I1003 14:23:24.189441 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c04651e-c4ab-4322-ae46-6ee8a115ed64","Type":"ContainerStarted","Data":"176c54f7a462af6c91f22e1ec274275169a8edf4bd52bd4cc0a6edc254975975"} Oct 03 14:23:24 crc kubenswrapper[4636]: I1003 14:23:24.189717 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c04651e-c4ab-4322-ae46-6ee8a115ed64","Type":"ContainerStarted","Data":"fa0e94bf0fc8fa06efe64f3e78f8ca763248adfd3d361ed1b4925898251be7f0"} Oct 03 14:23:24 crc kubenswrapper[4636]: I1003 14:23:24.217593 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.217565013 podStartE2EDuration="2.217565013s" podCreationTimestamp="2025-10-03 14:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:23:24.215212232 +0000 UTC m=+1354.073938489" watchObservedRunningTime="2025-10-03 14:23:24.217565013 +0000 UTC m=+1354.076291280" Oct 03 14:23:24 crc kubenswrapper[4636]: I1003 14:23:24.781760 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 14:23:26 crc kubenswrapper[4636]: I1003 14:23:26.582598 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 14:23:26 crc kubenswrapper[4636]: I1003 14:23:26.582950 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 14:23:29 crc kubenswrapper[4636]: I1003 14:23:29.782294 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 14:23:29 crc kubenswrapper[4636]: I1003 14:23:29.812345 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 14:23:30 crc kubenswrapper[4636]: I1003 14:23:30.273710 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 14:23:31 crc kubenswrapper[4636]: I1003 14:23:31.582492 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 14:23:31 crc kubenswrapper[4636]: I1003 14:23:31.583462 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 14:23:32 crc kubenswrapper[4636]: I1003 14:23:32.550828 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 14:23:32 crc kubenswrapper[4636]: I1003 14:23:32.555285 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 14:23:32 crc kubenswrapper[4636]: I1003 14:23:32.589834 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4c9bc86e-3770-40e9-bf37-80627278032b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 14:23:32 crc kubenswrapper[4636]: I1003 14:23:32.595351 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="4c9bc86e-3770-40e9-bf37-80627278032b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 14:23:33 crc kubenswrapper[4636]: I1003 14:23:33.597268 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8c04651e-c4ab-4322-ae46-6ee8a115ed64" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 14:23:33 crc kubenswrapper[4636]: I1003 14:23:33.597307 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8c04651e-c4ab-4322-ae46-6ee8a115ed64" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 14:23:38 crc kubenswrapper[4636]: I1003 14:23:38.612685 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 03 14:23:39 crc kubenswrapper[4636]: I1003 14:23:39.162960 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:23:39 crc kubenswrapper[4636]: I1003 14:23:39.163356 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:23:39 crc kubenswrapper[4636]: I1003 14:23:39.163411 4636 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 14:23:39 crc kubenswrapper[4636]: I1003 14:23:39.164292 4636 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f35c195de607af5e2083a70ee704e67efe4c37e24910c615f6adb0ee1029e41"} pod="openshift-machine-config-operator/machine-config-daemon-ngmch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 14:23:39 crc kubenswrapper[4636]: I1003 14:23:39.164350 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" containerID="cri-o://3f35c195de607af5e2083a70ee704e67efe4c37e24910c615f6adb0ee1029e41" gracePeriod=600 Oct 03 14:23:39 crc kubenswrapper[4636]: I1003 14:23:39.328306 4636 generic.go:334] "Generic (PLEG): container finished" podID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerID="3f35c195de607af5e2083a70ee704e67efe4c37e24910c615f6adb0ee1029e41" exitCode=0 Oct 03 14:23:39 crc kubenswrapper[4636]: I1003 14:23:39.328351 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerDied","Data":"3f35c195de607af5e2083a70ee704e67efe4c37e24910c615f6adb0ee1029e41"} Oct 03 14:23:39 crc kubenswrapper[4636]: I1003 14:23:39.328385 
4636 scope.go:117] "RemoveContainer" containerID="1d353a53ac9390ffae337e3feef5ea083eb94bb2a25b7898e4f341f0e42163eb" Oct 03 14:23:40 crc kubenswrapper[4636]: I1003 14:23:40.341041 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerStarted","Data":"4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186"} Oct 03 14:23:41 crc kubenswrapper[4636]: I1003 14:23:41.590991 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 14:23:41 crc kubenswrapper[4636]: I1003 14:23:41.596769 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 14:23:41 crc kubenswrapper[4636]: I1003 14:23:41.598944 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 14:23:42 crc kubenswrapper[4636]: I1003 14:23:42.365066 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 14:23:42 crc kubenswrapper[4636]: I1003 14:23:42.568692 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 14:23:42 crc kubenswrapper[4636]: I1003 14:23:42.569322 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 14:23:42 crc kubenswrapper[4636]: I1003 14:23:42.590738 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 14:23:42 crc kubenswrapper[4636]: I1003 14:23:42.601961 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 14:23:43 crc kubenswrapper[4636]: I1003 14:23:43.368607 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 14:23:43 crc kubenswrapper[4636]: I1003 14:23:43.374375 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 14:23:52 crc kubenswrapper[4636]: I1003 14:23:52.255478 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 14:23:52 crc kubenswrapper[4636]: I1003 14:23:52.944144 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 14:23:56 crc kubenswrapper[4636]: I1003 14:23:56.776866 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="61bd2d74-76de-402c-99af-f18ddf19610c" containerName="rabbitmq" containerID="cri-o://a1244419d73c937ed1b24b5c9360f4b695e372cfd2dc45c18e9b254746692304" gracePeriod=604796 Oct 03 14:23:57 crc kubenswrapper[4636]: I1003 14:23:57.631037 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="5f862438-7485-4e2c-a5b5-a6f3acf809ab" containerName="rabbitmq" containerID="cri-o://8d563ad166e37c6168c6f9ce5f3e2a2155b1b86d8a1584f09d19152a381e7fd5" gracePeriod=604796 Oct 03 14:24:00 crc kubenswrapper[4636]: I1003 14:24:00.006794 4636 scope.go:117] "RemoveContainer" containerID="3b3b20ccfcd23fcda9b3081643d0e21ccef7acc175e1f050656c5361184fec2a" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.581202 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.584204 4636 generic.go:334] "Generic (PLEG): container finished" podID="61bd2d74-76de-402c-99af-f18ddf19610c" containerID="a1244419d73c937ed1b24b5c9360f4b695e372cfd2dc45c18e9b254746692304" exitCode=0 Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.584264 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"61bd2d74-76de-402c-99af-f18ddf19610c","Type":"ContainerDied","Data":"a1244419d73c937ed1b24b5c9360f4b695e372cfd2dc45c18e9b254746692304"} Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.584298 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"61bd2d74-76de-402c-99af-f18ddf19610c","Type":"ContainerDied","Data":"2123c5fa2a14a66e241efce58f7257e0f5dbe885fd8d2254c53a22ac88cc654e"} Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.584319 4636 scope.go:117] "RemoveContainer" containerID="a1244419d73c937ed1b24b5c9360f4b695e372cfd2dc45c18e9b254746692304" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.624057 4636 scope.go:117] "RemoveContainer" containerID="73409de03024ed94671ee96763ba393f0a99ca4a5dad0d25b9dbdad608cb9eb8" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.660646 4636 scope.go:117] "RemoveContainer" containerID="a1244419d73c937ed1b24b5c9360f4b695e372cfd2dc45c18e9b254746692304" Oct 03 14:24:03 crc kubenswrapper[4636]: E1003 14:24:03.661668 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1244419d73c937ed1b24b5c9360f4b695e372cfd2dc45c18e9b254746692304\": container with ID starting with a1244419d73c937ed1b24b5c9360f4b695e372cfd2dc45c18e9b254746692304 not found: ID does not exist" containerID="a1244419d73c937ed1b24b5c9360f4b695e372cfd2dc45c18e9b254746692304" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.661700 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1244419d73c937ed1b24b5c9360f4b695e372cfd2dc45c18e9b254746692304"} err="failed to get container status \"a1244419d73c937ed1b24b5c9360f4b695e372cfd2dc45c18e9b254746692304\": rpc error: code = NotFound desc = could not find container \"a1244419d73c937ed1b24b5c9360f4b695e372cfd2dc45c18e9b254746692304\": container with ID starting with a1244419d73c937ed1b24b5c9360f4b695e372cfd2dc45c18e9b254746692304 not found: ID does not exist" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.661722 4636 scope.go:117] "RemoveContainer" containerID="73409de03024ed94671ee96763ba393f0a99ca4a5dad0d25b9dbdad608cb9eb8" Oct 03 14:24:03 crc kubenswrapper[4636]: E1003 14:24:03.661930 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73409de03024ed94671ee96763ba393f0a99ca4a5dad0d25b9dbdad608cb9eb8\": container with ID starting with 73409de03024ed94671ee96763ba393f0a99ca4a5dad0d25b9dbdad608cb9eb8 not found: ID does not exist" containerID="73409de03024ed94671ee96763ba393f0a99ca4a5dad0d25b9dbdad608cb9eb8" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.661956 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73409de03024ed94671ee96763ba393f0a99ca4a5dad0d25b9dbdad608cb9eb8"} err="failed to get container status \"73409de03024ed94671ee96763ba393f0a99ca4a5dad0d25b9dbdad608cb9eb8\": rpc error: code = NotFound desc = could not find container 
\"73409de03024ed94671ee96763ba393f0a99ca4a5dad0d25b9dbdad608cb9eb8\": container with ID starting with 73409de03024ed94671ee96763ba393f0a99ca4a5dad0d25b9dbdad608cb9eb8 not found: ID does not exist" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.705900 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-tls\") pod \"61bd2d74-76de-402c-99af-f18ddf19610c\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.705974 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-erlang-cookie\") pod \"61bd2d74-76de-402c-99af-f18ddf19610c\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.706001 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/61bd2d74-76de-402c-99af-f18ddf19610c-pod-info\") pod \"61bd2d74-76de-402c-99af-f18ddf19610c\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.706129 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"61bd2d74-76de-402c-99af-f18ddf19610c\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.706182 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vbvn\" (UniqueName: \"kubernetes.io/projected/61bd2d74-76de-402c-99af-f18ddf19610c-kube-api-access-6vbvn\") pod \"61bd2d74-76de-402c-99af-f18ddf19610c\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.706202 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/61bd2d74-76de-402c-99af-f18ddf19610c-plugins-conf\") pod \"61bd2d74-76de-402c-99af-f18ddf19610c\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.706227 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/61bd2d74-76de-402c-99af-f18ddf19610c-config-data\") pod \"61bd2d74-76de-402c-99af-f18ddf19610c\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.706252 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-plugins\") pod \"61bd2d74-76de-402c-99af-f18ddf19610c\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.706267 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/61bd2d74-76de-402c-99af-f18ddf19610c-erlang-cookie-secret\") pod \"61bd2d74-76de-402c-99af-f18ddf19610c\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.706308 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-confd\") pod \"61bd2d74-76de-402c-99af-f18ddf19610c\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.706341 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/61bd2d74-76de-402c-99af-f18ddf19610c-server-conf\") pod \"61bd2d74-76de-402c-99af-f18ddf19610c\" (UID: \"61bd2d74-76de-402c-99af-f18ddf19610c\") " Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.708270 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "61bd2d74-76de-402c-99af-f18ddf19610c" (UID: "61bd2d74-76de-402c-99af-f18ddf19610c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.709756 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61bd2d74-76de-402c-99af-f18ddf19610c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "61bd2d74-76de-402c-99af-f18ddf19610c" (UID: "61bd2d74-76de-402c-99af-f18ddf19610c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.716492 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "61bd2d74-76de-402c-99af-f18ddf19610c" (UID: "61bd2d74-76de-402c-99af-f18ddf19610c"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.719977 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/61bd2d74-76de-402c-99af-f18ddf19610c-pod-info" (OuterVolumeSpecName: "pod-info") pod "61bd2d74-76de-402c-99af-f18ddf19610c" (UID: "61bd2d74-76de-402c-99af-f18ddf19610c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.723531 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "61bd2d74-76de-402c-99af-f18ddf19610c" (UID: "61bd2d74-76de-402c-99af-f18ddf19610c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.725062 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61bd2d74-76de-402c-99af-f18ddf19610c-kube-api-access-6vbvn" (OuterVolumeSpecName: "kube-api-access-6vbvn") pod "61bd2d74-76de-402c-99af-f18ddf19610c" (UID: "61bd2d74-76de-402c-99af-f18ddf19610c"). InnerVolumeSpecName "kube-api-access-6vbvn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.727641 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61bd2d74-76de-402c-99af-f18ddf19610c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "61bd2d74-76de-402c-99af-f18ddf19610c" (UID: "61bd2d74-76de-402c-99af-f18ddf19610c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.742257 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "61bd2d74-76de-402c-99af-f18ddf19610c" (UID: "61bd2d74-76de-402c-99af-f18ddf19610c"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.808965 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vbvn\" (UniqueName: \"kubernetes.io/projected/61bd2d74-76de-402c-99af-f18ddf19610c-kube-api-access-6vbvn\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.809005 4636 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/61bd2d74-76de-402c-99af-f18ddf19610c-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.809017 4636 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.809028 4636 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/61bd2d74-76de-402c-99af-f18ddf19610c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.809040 4636 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.809051 4636 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.809061 4636 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/61bd2d74-76de-402c-99af-f18ddf19610c-pod-info\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.809086 4636 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.837686 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61bd2d74-76de-402c-99af-f18ddf19610c-config-data" (OuterVolumeSpecName: "config-data") pod "61bd2d74-76de-402c-99af-f18ddf19610c" (UID: "61bd2d74-76de-402c-99af-f18ddf19610c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.866656 4636 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.886808 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61bd2d74-76de-402c-99af-f18ddf19610c-server-conf" (OuterVolumeSpecName: "server-conf") pod "61bd2d74-76de-402c-99af-f18ddf19610c" (UID: "61bd2d74-76de-402c-99af-f18ddf19610c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.910881 4636 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.910919 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/61bd2d74-76de-402c-99af-f18ddf19610c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.910931 4636 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/61bd2d74-76de-402c-99af-f18ddf19610c-server-conf\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:03 crc kubenswrapper[4636]: I1003 14:24:03.960563 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "61bd2d74-76de-402c-99af-f18ddf19610c" (UID: "61bd2d74-76de-402c-99af-f18ddf19610c"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.012315 4636 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/61bd2d74-76de-402c-99af-f18ddf19610c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.135493 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.217872 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f862438-7485-4e2c-a5b5-a6f3acf809ab-config-data\") pod \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.217966 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2p6h\" (UniqueName: \"kubernetes.io/projected/5f862438-7485-4e2c-a5b5-a6f3acf809ab-kube-api-access-j2p6h\") pod \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.218008 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-erlang-cookie\") pod \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.218044 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-tls\") pod \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.218069 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-plugins\") pod \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.218109 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5f862438-7485-4e2c-a5b5-a6f3acf809ab-pod-info\") pod \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.218168 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-confd\") pod \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.218201 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.218220 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5f862438-7485-4e2c-a5b5-a6f3acf809ab-erlang-cookie-secret\") pod \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.218337 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5f862438-7485-4e2c-a5b5-a6f3acf809ab-server-conf\") pod \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\" (UID: 
\"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.218389 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5f862438-7485-4e2c-a5b5-a6f3acf809ab-plugins-conf\") pod \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\" (UID: \"5f862438-7485-4e2c-a5b5-a6f3acf809ab\") " Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.224442 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f862438-7485-4e2c-a5b5-a6f3acf809ab-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5f862438-7485-4e2c-a5b5-a6f3acf809ab" (UID: "5f862438-7485-4e2c-a5b5-a6f3acf809ab"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.228499 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5f862438-7485-4e2c-a5b5-a6f3acf809ab" (UID: "5f862438-7485-4e2c-a5b5-a6f3acf809ab"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.228926 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "5f862438-7485-4e2c-a5b5-a6f3acf809ab" (UID: "5f862438-7485-4e2c-a5b5-a6f3acf809ab"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.229055 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5f862438-7485-4e2c-a5b5-a6f3acf809ab" (UID: "5f862438-7485-4e2c-a5b5-a6f3acf809ab"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.233273 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f862438-7485-4e2c-a5b5-a6f3acf809ab-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5f862438-7485-4e2c-a5b5-a6f3acf809ab" (UID: "5f862438-7485-4e2c-a5b5-a6f3acf809ab"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.238029 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "5f862438-7485-4e2c-a5b5-a6f3acf809ab" (UID: "5f862438-7485-4e2c-a5b5-a6f3acf809ab"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.243902 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f862438-7485-4e2c-a5b5-a6f3acf809ab-kube-api-access-j2p6h" (OuterVolumeSpecName: "kube-api-access-j2p6h") pod "5f862438-7485-4e2c-a5b5-a6f3acf809ab" (UID: "5f862438-7485-4e2c-a5b5-a6f3acf809ab"). InnerVolumeSpecName "kube-api-access-j2p6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.255253 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5f862438-7485-4e2c-a5b5-a6f3acf809ab-pod-info" (OuterVolumeSpecName: "pod-info") pod "5f862438-7485-4e2c-a5b5-a6f3acf809ab" (UID: "5f862438-7485-4e2c-a5b5-a6f3acf809ab"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.301841 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f862438-7485-4e2c-a5b5-a6f3acf809ab-config-data" (OuterVolumeSpecName: "config-data") pod "5f862438-7485-4e2c-a5b5-a6f3acf809ab" (UID: "5f862438-7485-4e2c-a5b5-a6f3acf809ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.320286 4636 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.320318 4636 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5f862438-7485-4e2c-a5b5-a6f3acf809ab-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.320329 4636 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5f862438-7485-4e2c-a5b5-a6f3acf809ab-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.320338 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f862438-7485-4e2c-a5b5-a6f3acf809ab-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.320349 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2p6h\" (UniqueName: \"kubernetes.io/projected/5f862438-7485-4e2c-a5b5-a6f3acf809ab-kube-api-access-j2p6h\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.320358 4636 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.320367 4636 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.320376 4636 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.320384 4636 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5f862438-7485-4e2c-a5b5-a6f3acf809ab-pod-info\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.350864 4636 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on 
node "crc" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.371980 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f862438-7485-4e2c-a5b5-a6f3acf809ab-server-conf" (OuterVolumeSpecName: "server-conf") pod "5f862438-7485-4e2c-a5b5-a6f3acf809ab" (UID: "5f862438-7485-4e2c-a5b5-a6f3acf809ab"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.424199 4636 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5f862438-7485-4e2c-a5b5-a6f3acf809ab-server-conf\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.424235 4636 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.447029 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5f862438-7485-4e2c-a5b5-a6f3acf809ab" (UID: "5f862438-7485-4e2c-a5b5-a6f3acf809ab"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.526345 4636 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5f862438-7485-4e2c-a5b5-a6f3acf809ab-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.594051 4636 generic.go:334] "Generic (PLEG): container finished" podID="5f862438-7485-4e2c-a5b5-a6f3acf809ab" containerID="8d563ad166e37c6168c6f9ce5f3e2a2155b1b86d8a1584f09d19152a381e7fd5" exitCode=0 Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.594117 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.594109 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5f862438-7485-4e2c-a5b5-a6f3acf809ab","Type":"ContainerDied","Data":"8d563ad166e37c6168c6f9ce5f3e2a2155b1b86d8a1584f09d19152a381e7fd5"} Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.594180 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5f862438-7485-4e2c-a5b5-a6f3acf809ab","Type":"ContainerDied","Data":"3eeb83d2eabc49e32d04e8d6e7edf54b059f95f9ad21d540f4b633d992db172f"} Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.594200 4636 scope.go:117] "RemoveContainer" containerID="8d563ad166e37c6168c6f9ce5f3e2a2155b1b86d8a1584f09d19152a381e7fd5" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.595483 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.633258 4636 scope.go:117] "RemoveContainer" containerID="01ffb247be0a30764551cb82b336ce02fedb37cd673ba1df60771c5e5fa407af" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.639387 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.664236 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.691784 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.702810 4636 scope.go:117] "RemoveContainer" containerID="8d563ad166e37c6168c6f9ce5f3e2a2155b1b86d8a1584f09d19152a381e7fd5" Oct 03 14:24:04 crc kubenswrapper[4636]: E1003 14:24:04.704538 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d563ad166e37c6168c6f9ce5f3e2a2155b1b86d8a1584f09d19152a381e7fd5\": container with ID starting with 8d563ad166e37c6168c6f9ce5f3e2a2155b1b86d8a1584f09d19152a381e7fd5 not found: ID does not exist" containerID="8d563ad166e37c6168c6f9ce5f3e2a2155b1b86d8a1584f09d19152a381e7fd5" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.704577 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d563ad166e37c6168c6f9ce5f3e2a2155b1b86d8a1584f09d19152a381e7fd5"} err="failed to get container status \"8d563ad166e37c6168c6f9ce5f3e2a2155b1b86d8a1584f09d19152a381e7fd5\": rpc error: code = NotFound desc = could not find container \"8d563ad166e37c6168c6f9ce5f3e2a2155b1b86d8a1584f09d19152a381e7fd5\": container with ID starting with 8d563ad166e37c6168c6f9ce5f3e2a2155b1b86d8a1584f09d19152a381e7fd5 not found: ID does not exist" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.704604 4636 scope.go:117] "RemoveContainer" containerID="01ffb247be0a30764551cb82b336ce02fedb37cd673ba1df60771c5e5fa407af" Oct 03 14:24:04 crc kubenswrapper[4636]: E1003 14:24:04.704959 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01ffb247be0a30764551cb82b336ce02fedb37cd673ba1df60771c5e5fa407af\": container with ID starting with 01ffb247be0a30764551cb82b336ce02fedb37cd673ba1df60771c5e5fa407af not found: ID does not exist" containerID="01ffb247be0a30764551cb82b336ce02fedb37cd673ba1df60771c5e5fa407af" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.704988 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01ffb247be0a30764551cb82b336ce02fedb37cd673ba1df60771c5e5fa407af"} err="failed to get container status \"01ffb247be0a30764551cb82b336ce02fedb37cd673ba1df60771c5e5fa407af\": rpc error: code = NotFound desc = could not find container \"01ffb247be0a30764551cb82b336ce02fedb37cd673ba1df60771c5e5fa407af\": container with ID starting with 01ffb247be0a30764551cb82b336ce02fedb37cd673ba1df60771c5e5fa407af not found: ID does not exist" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.735707 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.755563 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 14:24:04 crc kubenswrapper[4636]: E1003 14:24:04.755922 4636 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f862438-7485-4e2c-a5b5-a6f3acf809ab" containerName="rabbitmq" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.755936 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f862438-7485-4e2c-a5b5-a6f3acf809ab" containerName="rabbitmq" Oct 03 14:24:04 crc kubenswrapper[4636]: E1003 14:24:04.755953 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61bd2d74-76de-402c-99af-f18ddf19610c" containerName="setup-container" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.755959 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="61bd2d74-76de-402c-99af-f18ddf19610c" containerName="setup-container" Oct 03 14:24:04 crc kubenswrapper[4636]: E1003 14:24:04.755983 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61bd2d74-76de-402c-99af-f18ddf19610c" containerName="rabbitmq" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.755989 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="61bd2d74-76de-402c-99af-f18ddf19610c" containerName="rabbitmq" Oct 03 14:24:04 crc kubenswrapper[4636]: E1003 14:24:04.756001 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f862438-7485-4e2c-a5b5-a6f3acf809ab" containerName="setup-container" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.756007 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f862438-7485-4e2c-a5b5-a6f3acf809ab" containerName="setup-container" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.756190 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f862438-7485-4e2c-a5b5-a6f3acf809ab" containerName="rabbitmq" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.756209 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="61bd2d74-76de-402c-99af-f18ddf19610c" containerName="rabbitmq" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.757066 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.760555 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.761005 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.761247 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.761377 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.761507 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.761634 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-xnwcw" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.780074 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.838233 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.838596 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.838850 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.839207 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.839324 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.839447 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-config-data\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc 
kubenswrapper[4636]: I1003 14:24:04.839736 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.839879 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h49b8\" (UniqueName: \"kubernetes.io/projected/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-kube-api-access-h49b8\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.840767 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.840811 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.840909 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.859525 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f862438-7485-4e2c-a5b5-a6f3acf809ab" path="/var/lib/kubelet/pods/5f862438-7485-4e2c-a5b5-a6f3acf809ab/volumes" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.861171 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61bd2d74-76de-402c-99af-f18ddf19610c" path="/var/lib/kubelet/pods/61bd2d74-76de-402c-99af-f18ddf19610c/volumes" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.863481 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.865007 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.867935 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.868504 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.868666 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-hgc9d" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.868860 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.869028 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.869210 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.869304 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.869555 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.882093 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.943451 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.943516 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.943546 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.943598 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.943626 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.943652 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-config-data\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.943710 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.943756 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h49b8\" (UniqueName: \"kubernetes.io/projected/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-kube-api-access-h49b8\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.943802 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.943826 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.943852 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.945347 4636 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.945947 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.946343 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.946550 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-config-data\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 
crc kubenswrapper[4636]: I1003 14:24:04.946882 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.950242 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.950948 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.953729 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.953755 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.962709 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h49b8\" (UniqueName: \"kubernetes.io/projected/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-kube-api-access-h49b8\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.965160 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f7c3cb64-6553-4d95-8ccc-25f758b3cc97-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:04 crc kubenswrapper[4636]: I1003 14:24:04.982025 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"f7c3cb64-6553-4d95-8ccc-25f758b3cc97\") " pod="openstack/rabbitmq-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.045421 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e97eeb5a-f169-4c58-bda2-c727ca1f5126-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.045470 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.045508 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e97eeb5a-f169-4c58-bda2-c727ca1f5126-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.045538 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e97eeb5a-f169-4c58-bda2-c727ca1f5126-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.045710 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e97eeb5a-f169-4c58-bda2-c727ca1f5126-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.045787 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e97eeb5a-f169-4c58-bda2-c727ca1f5126-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.045889 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e97eeb5a-f169-4c58-bda2-c727ca1f5126-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.046006 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e97eeb5a-f169-4c58-bda2-c727ca1f5126-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.046073 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e97eeb5a-f169-4c58-bda2-c727ca1f5126-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.046147 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e97eeb5a-f169-4c58-bda2-c727ca1f5126-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.046200 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7t8n\" (UniqueName: \"kubernetes.io/projected/e97eeb5a-f169-4c58-bda2-c727ca1f5126-kube-api-access-p7t8n\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.104741 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.148645 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e97eeb5a-f169-4c58-bda2-c727ca1f5126-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.148982 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e97eeb5a-f169-4c58-bda2-c727ca1f5126-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.149036 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e97eeb5a-f169-4c58-bda2-c727ca1f5126-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.149063 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e97eeb5a-f169-4c58-bda2-c727ca1f5126-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.149116 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7t8n\" (UniqueName: \"kubernetes.io/projected/e97eeb5a-f169-4c58-bda2-c727ca1f5126-kube-api-access-p7t8n\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.149171 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e97eeb5a-f169-4c58-bda2-c727ca1f5126-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.149210 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.149231 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e97eeb5a-f169-4c58-bda2-c727ca1f5126-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.149253 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/e97eeb5a-f169-4c58-bda2-c727ca1f5126-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.149320 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e97eeb5a-f169-4c58-bda2-c727ca1f5126-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.149362 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e97eeb5a-f169-4c58-bda2-c727ca1f5126-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.149639 4636 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.150057 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e97eeb5a-f169-4c58-bda2-c727ca1f5126-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.150603 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e97eeb5a-f169-4c58-bda2-c727ca1f5126-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.150639 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e97eeb5a-f169-4c58-bda2-c727ca1f5126-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.150647 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e97eeb5a-f169-4c58-bda2-c727ca1f5126-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.150920 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e97eeb5a-f169-4c58-bda2-c727ca1f5126-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.153839 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e97eeb5a-f169-4c58-bda2-c727ca1f5126-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.154668 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e97eeb5a-f169-4c58-bda2-c727ca1f5126-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.155370 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e97eeb5a-f169-4c58-bda2-c727ca1f5126-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.155805 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e97eeb5a-f169-4c58-bda2-c727ca1f5126-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.173803 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7t8n\" (UniqueName: \"kubernetes.io/projected/e97eeb5a-f169-4c58-bda2-c727ca1f5126-kube-api-access-p7t8n\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.197414 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e97eeb5a-f169-4c58-bda2-c727ca1f5126\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.492704 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.652680 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 14:24:05 crc kubenswrapper[4636]: W1003 14:24:05.661951 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7c3cb64_6553_4d95_8ccc_25f758b3cc97.slice/crio-ed285c073096b77daa3593253cc37db8b6d2e43063cf04bd65b52488542c01b5 WatchSource:0}: Error finding container ed285c073096b77daa3593253cc37db8b6d2e43063cf04bd65b52488542c01b5: Status 404 returned error can't find the container with id ed285c073096b77daa3593253cc37db8b6d2e43063cf04bd65b52488542c01b5 Oct 03 14:24:05 crc kubenswrapper[4636]: I1003 14:24:05.817270 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.448356 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-qtblw"] Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.450806 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.453807 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.466177 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-qtblw"] Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.510622 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-qtblw\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.510691 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-config\") pod \"dnsmasq-dns-79bd4cc8c9-qtblw\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.510751 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-qtblw\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.510844 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-qtblw\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.510890 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-qtblw\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.510948 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-qtblw\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.510985 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2lr7\" (UniqueName: \"kubernetes.io/projected/72110982-08c6-4d46-bdd0-cb7c5076a252-kube-api-access-s2lr7\") pod \"dnsmasq-dns-79bd4cc8c9-qtblw\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.612422 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2lr7\" (UniqueName: \"kubernetes.io/projected/72110982-08c6-4d46-bdd0-cb7c5076a252-kube-api-access-s2lr7\") pod 
\"dnsmasq-dns-79bd4cc8c9-qtblw\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.612497 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-qtblw\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.612611 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-config\") pod \"dnsmasq-dns-79bd4cc8c9-qtblw\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.612668 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-qtblw\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.612719 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-qtblw\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.612743 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-qtblw\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.612776 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-qtblw\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.613718 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-qtblw\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.613821 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-config\") pod \"dnsmasq-dns-79bd4cc8c9-qtblw\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.613822 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-qtblw\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.613833 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-qtblw\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.613905 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-qtblw\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.614120 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-qtblw\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.631456 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f7c3cb64-6553-4d95-8ccc-25f758b3cc97","Type":"ContainerStarted","Data":"ed285c073096b77daa3593253cc37db8b6d2e43063cf04bd65b52488542c01b5"} Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.632932 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e97eeb5a-f169-4c58-bda2-c727ca1f5126","Type":"ContainerStarted","Data":"ac12255ebde3a0c416a1c373ecd543cd592a0a37e551545945101eded4863f87"} Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.636866 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2lr7\" (UniqueName: \"kubernetes.io/projected/72110982-08c6-4d46-bdd0-cb7c5076a252-kube-api-access-s2lr7\") pod \"dnsmasq-dns-79bd4cc8c9-qtblw\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:06 crc kubenswrapper[4636]: I1003 14:24:06.817409 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:07 crc kubenswrapper[4636]: I1003 14:24:07.344836 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-qtblw"] Oct 03 14:24:07 crc kubenswrapper[4636]: I1003 14:24:07.642551 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" event={"ID":"72110982-08c6-4d46-bdd0-cb7c5076a252","Type":"ContainerStarted","Data":"951200092ab72b1ede31c935fa4df2631cbea412b672bae94eab3ebfbe934de4"} Oct 03 14:24:07 crc kubenswrapper[4636]: I1003 14:24:07.644971 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f7c3cb64-6553-4d95-8ccc-25f758b3cc97","Type":"ContainerStarted","Data":"35d9aff2b0dd94141b4abe4effca61d01263a6ec389f6d59009b7857254ced7e"} Oct 03 14:24:07 crc kubenswrapper[4636]: I1003 14:24:07.646837 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e97eeb5a-f169-4c58-bda2-c727ca1f5126","Type":"ContainerStarted","Data":"1310a34ae63b3e412bc0c551b9d7606124d737992ed0096e34ec2a67154c047a"} Oct 03 14:24:08 crc kubenswrapper[4636]: I1003 14:24:08.656641 4636 generic.go:334] "Generic (PLEG): container finished" podID="72110982-08c6-4d46-bdd0-cb7c5076a252" containerID="3098872bbce09671aa8adaa4ed298fd0d7193a75396eb3866d19fb8042e6e945" exitCode=0 Oct 03 14:24:08 crc kubenswrapper[4636]: I1003 14:24:08.656735 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" event={"ID":"72110982-08c6-4d46-bdd0-cb7c5076a252","Type":"ContainerDied","Data":"3098872bbce09671aa8adaa4ed298fd0d7193a75396eb3866d19fb8042e6e945"} Oct 03 14:24:09 crc kubenswrapper[4636]: I1003 14:24:09.693804 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" event={"ID":"72110982-08c6-4d46-bdd0-cb7c5076a252","Type":"ContainerStarted","Data":"b79907ce8b77d5e3f9da6449b148d5a350d9d4a723867a9f81aa27fa80626301"} Oct 03 14:24:09 crc kubenswrapper[4636]: I1003 14:24:09.694399 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:09 crc kubenswrapper[4636]: I1003 14:24:09.733191 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" podStartSLOduration=3.733170692 podStartE2EDuration="3.733170692s" podCreationTimestamp="2025-10-03 14:24:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:24:09.725729047 +0000 UTC m=+1399.584455314" watchObservedRunningTime="2025-10-03 14:24:09.733170692 +0000 UTC m=+1399.591896939" Oct 03 14:24:16 crc kubenswrapper[4636]: I1003 14:24:16.819350 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:16 crc kubenswrapper[4636]: I1003 14:24:16.908257 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-s7jpb"] Oct 03 14:24:16 crc kubenswrapper[4636]: I1003 14:24:16.908509 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" podUID="72cdceb9-a893-4565-aa03-d1cbdf9550ae" containerName="dnsmasq-dns" containerID="cri-o://8e03415c3cfded96380599ccbaabaccdedda9e63a0562d18a2f6268af39c4ef8" gracePeriod=10 Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 
14:24:17.095867 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d7677974f-dtkft"] Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.098112 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.131013 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d7677974f-dtkft"] Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.139280 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/317017e9-687f-4a84-b896-fab84c269e2b-ovsdbserver-nb\") pod \"dnsmasq-dns-d7677974f-dtkft\" (UID: \"317017e9-687f-4a84-b896-fab84c269e2b\") " pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.139624 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/317017e9-687f-4a84-b896-fab84c269e2b-dns-svc\") pod \"dnsmasq-dns-d7677974f-dtkft\" (UID: \"317017e9-687f-4a84-b896-fab84c269e2b\") " pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.139776 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/317017e9-687f-4a84-b896-fab84c269e2b-config\") pod \"dnsmasq-dns-d7677974f-dtkft\" (UID: \"317017e9-687f-4a84-b896-fab84c269e2b\") " pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.139884 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/317017e9-687f-4a84-b896-fab84c269e2b-dns-swift-storage-0\") pod \"dnsmasq-dns-d7677974f-dtkft\" (UID: \"317017e9-687f-4a84-b896-fab84c269e2b\") " pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.139960 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksf76\" (UniqueName: \"kubernetes.io/projected/317017e9-687f-4a84-b896-fab84c269e2b-kube-api-access-ksf76\") pod \"dnsmasq-dns-d7677974f-dtkft\" (UID: \"317017e9-687f-4a84-b896-fab84c269e2b\") " pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.140124 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/317017e9-687f-4a84-b896-fab84c269e2b-ovsdbserver-sb\") pod \"dnsmasq-dns-d7677974f-dtkft\" (UID: \"317017e9-687f-4a84-b896-fab84c269e2b\") " pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.140379 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/317017e9-687f-4a84-b896-fab84c269e2b-openstack-edpm-ipam\") pod \"dnsmasq-dns-d7677974f-dtkft\" (UID: \"317017e9-687f-4a84-b896-fab84c269e2b\") " pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.242221 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/317017e9-687f-4a84-b896-fab84c269e2b-ovsdbserver-sb\") pod \"dnsmasq-dns-d7677974f-dtkft\" (UID: \"317017e9-687f-4a84-b896-fab84c269e2b\") " pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.243280 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/317017e9-687f-4a84-b896-fab84c269e2b-openstack-edpm-ipam\") pod \"dnsmasq-dns-d7677974f-dtkft\" (UID: \"317017e9-687f-4a84-b896-fab84c269e2b\") " pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.243288 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/317017e9-687f-4a84-b896-fab84c269e2b-ovsdbserver-sb\") pod \"dnsmasq-dns-d7677974f-dtkft\" (UID: \"317017e9-687f-4a84-b896-fab84c269e2b\") " pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.242314 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/317017e9-687f-4a84-b896-fab84c269e2b-openstack-edpm-ipam\") pod \"dnsmasq-dns-d7677974f-dtkft\" (UID: \"317017e9-687f-4a84-b896-fab84c269e2b\") " pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.243556 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/317017e9-687f-4a84-b896-fab84c269e2b-ovsdbserver-nb\") pod \"dnsmasq-dns-d7677974f-dtkft\" (UID: \"317017e9-687f-4a84-b896-fab84c269e2b\") " pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.243607 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/317017e9-687f-4a84-b896-fab84c269e2b-dns-svc\") pod \"dnsmasq-dns-d7677974f-dtkft\" (UID: \"317017e9-687f-4a84-b896-fab84c269e2b\") " pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.244234 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/317017e9-687f-4a84-b896-fab84c269e2b-ovsdbserver-nb\") pod \"dnsmasq-dns-d7677974f-dtkft\" (UID: \"317017e9-687f-4a84-b896-fab84c269e2b\") " pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.244396 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/317017e9-687f-4a84-b896-fab84c269e2b-dns-svc\") pod \"dnsmasq-dns-d7677974f-dtkft\" (UID: \"317017e9-687f-4a84-b896-fab84c269e2b\") " pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.244867 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/317017e9-687f-4a84-b896-fab84c269e2b-config\") pod \"dnsmasq-dns-d7677974f-dtkft\" (UID: \"317017e9-687f-4a84-b896-fab84c269e2b\") " pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.244961 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/317017e9-687f-4a84-b896-fab84c269e2b-dns-swift-storage-0\") pod \"dnsmasq-dns-d7677974f-dtkft\" 
(UID: \"317017e9-687f-4a84-b896-fab84c269e2b\") " pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.245025 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksf76\" (UniqueName: \"kubernetes.io/projected/317017e9-687f-4a84-b896-fab84c269e2b-kube-api-access-ksf76\") pod \"dnsmasq-dns-d7677974f-dtkft\" (UID: \"317017e9-687f-4a84-b896-fab84c269e2b\") " pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.245592 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/317017e9-687f-4a84-b896-fab84c269e2b-dns-swift-storage-0\") pod \"dnsmasq-dns-d7677974f-dtkft\" (UID: \"317017e9-687f-4a84-b896-fab84c269e2b\") " pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.245652 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/317017e9-687f-4a84-b896-fab84c269e2b-config\") pod \"dnsmasq-dns-d7677974f-dtkft\" (UID: \"317017e9-687f-4a84-b896-fab84c269e2b\") " pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.271348 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksf76\" (UniqueName: \"kubernetes.io/projected/317017e9-687f-4a84-b896-fab84c269e2b-kube-api-access-ksf76\") pod \"dnsmasq-dns-d7677974f-dtkft\" (UID: \"317017e9-687f-4a84-b896-fab84c269e2b\") " pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.423092 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.621863 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.760477 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-dns-svc\") pod \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.760859 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-config\") pod \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.760958 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-ovsdbserver-sb\") pod \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.760975 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-ovsdbserver-nb\") pod \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.761017 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-dns-swift-storage-0\") pod \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.761057 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rllh\" (UniqueName: \"kubernetes.io/projected/72cdceb9-a893-4565-aa03-d1cbdf9550ae-kube-api-access-7rllh\") pod \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\" (UID: \"72cdceb9-a893-4565-aa03-d1cbdf9550ae\") " Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.822361 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72cdceb9-a893-4565-aa03-d1cbdf9550ae-kube-api-access-7rllh" (OuterVolumeSpecName: "kube-api-access-7rllh") pod "72cdceb9-a893-4565-aa03-d1cbdf9550ae" (UID: "72cdceb9-a893-4565-aa03-d1cbdf9550ae"). InnerVolumeSpecName "kube-api-access-7rllh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.824627 4636 generic.go:334] "Generic (PLEG): container finished" podID="72cdceb9-a893-4565-aa03-d1cbdf9550ae" containerID="8e03415c3cfded96380599ccbaabaccdedda9e63a0562d18a2f6268af39c4ef8" exitCode=0 Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.824672 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" event={"ID":"72cdceb9-a893-4565-aa03-d1cbdf9550ae","Type":"ContainerDied","Data":"8e03415c3cfded96380599ccbaabaccdedda9e63a0562d18a2f6268af39c4ef8"} Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.824704 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" event={"ID":"72cdceb9-a893-4565-aa03-d1cbdf9550ae","Type":"ContainerDied","Data":"566f9ecaf5fcd3a0acf3ff1db8fddce2549ed53d3366d5adcff2f06c9a0ea11a"} Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.824719 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.824723 4636 scope.go:117] "RemoveContainer" containerID="8e03415c3cfded96380599ccbaabaccdedda9e63a0562d18a2f6268af39c4ef8" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.879972 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rllh\" (UniqueName: \"kubernetes.io/projected/72cdceb9-a893-4565-aa03-d1cbdf9550ae-kube-api-access-7rllh\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.886587 4636 scope.go:117] "RemoveContainer" containerID="54d14b88b8b1875e78386e36d5121c6a0131ee8495352237b79ce9afbcbccd33" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.905857 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-config" (OuterVolumeSpecName: "config") pod "72cdceb9-a893-4565-aa03-d1cbdf9550ae" (UID: "72cdceb9-a893-4565-aa03-d1cbdf9550ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.907528 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "72cdceb9-a893-4565-aa03-d1cbdf9550ae" (UID: "72cdceb9-a893-4565-aa03-d1cbdf9550ae"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.944349 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d7677974f-dtkft"] Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.955875 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "72cdceb9-a893-4565-aa03-d1cbdf9550ae" (UID: "72cdceb9-a893-4565-aa03-d1cbdf9550ae"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.964986 4636 scope.go:117] "RemoveContainer" containerID="8e03415c3cfded96380599ccbaabaccdedda9e63a0562d18a2f6268af39c4ef8" Oct 03 14:24:17 crc kubenswrapper[4636]: E1003 14:24:17.965783 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e03415c3cfded96380599ccbaabaccdedda9e63a0562d18a2f6268af39c4ef8\": container with ID starting with 8e03415c3cfded96380599ccbaabaccdedda9e63a0562d18a2f6268af39c4ef8 not found: ID does not exist" containerID="8e03415c3cfded96380599ccbaabaccdedda9e63a0562d18a2f6268af39c4ef8" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.965819 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e03415c3cfded96380599ccbaabaccdedda9e63a0562d18a2f6268af39c4ef8"} err="failed to get container status \"8e03415c3cfded96380599ccbaabaccdedda9e63a0562d18a2f6268af39c4ef8\": rpc error: code = NotFound desc = could not find container \"8e03415c3cfded96380599ccbaabaccdedda9e63a0562d18a2f6268af39c4ef8\": container with ID starting with 8e03415c3cfded96380599ccbaabaccdedda9e63a0562d18a2f6268af39c4ef8 not found: ID does not exist" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.965841 4636 scope.go:117] "RemoveContainer" containerID="54d14b88b8b1875e78386e36d5121c6a0131ee8495352237b79ce9afbcbccd33" Oct 03 14:24:17 crc kubenswrapper[4636]: E1003 14:24:17.968528 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54d14b88b8b1875e78386e36d5121c6a0131ee8495352237b79ce9afbcbccd33\": container with ID starting with 54d14b88b8b1875e78386e36d5121c6a0131ee8495352237b79ce9afbcbccd33 not found: ID does not exist" containerID="54d14b88b8b1875e78386e36d5121c6a0131ee8495352237b79ce9afbcbccd33" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.968557 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d14b88b8b1875e78386e36d5121c6a0131ee8495352237b79ce9afbcbccd33"} err="failed to get container status \"54d14b88b8b1875e78386e36d5121c6a0131ee8495352237b79ce9afbcbccd33\": rpc error: code = NotFound desc = could not find container \"54d14b88b8b1875e78386e36d5121c6a0131ee8495352237b79ce9afbcbccd33\": container with ID starting with 54d14b88b8b1875e78386e36d5121c6a0131ee8495352237b79ce9afbcbccd33 not found: ID does not exist" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.980013 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "72cdceb9-a893-4565-aa03-d1cbdf9550ae" (UID: "72cdceb9-a893-4565-aa03-d1cbdf9550ae"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.982736 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.982894 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.982967 4636 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:17 crc kubenswrapper[4636]: I1003 14:24:17.983020 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:18 crc kubenswrapper[4636]: I1003 14:24:18.011687 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "72cdceb9-a893-4565-aa03-d1cbdf9550ae" (UID: "72cdceb9-a893-4565-aa03-d1cbdf9550ae"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:24:18 crc kubenswrapper[4636]: I1003 14:24:18.084912 4636 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/72cdceb9-a893-4565-aa03-d1cbdf9550ae-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:18 crc kubenswrapper[4636]: I1003 14:24:18.199826 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-s7jpb"] Oct 03 14:24:18 crc kubenswrapper[4636]: I1003 14:24:18.210142 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-s7jpb"] Oct 03 14:24:18 crc kubenswrapper[4636]: I1003 14:24:18.803791 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72cdceb9-a893-4565-aa03-d1cbdf9550ae" path="/var/lib/kubelet/pods/72cdceb9-a893-4565-aa03-d1cbdf9550ae/volumes" Oct 03 14:24:18 crc kubenswrapper[4636]: I1003 14:24:18.835799 4636 generic.go:334] "Generic (PLEG): container finished" podID="317017e9-687f-4a84-b896-fab84c269e2b" containerID="2c2f582665500e787c99a0a9097dbac250fe2cab4d4fed2d0b0d3c211d19ea35" exitCode=0 Oct 03 14:24:18 crc kubenswrapper[4636]: I1003 14:24:18.835875 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d7677974f-dtkft" event={"ID":"317017e9-687f-4a84-b896-fab84c269e2b","Type":"ContainerDied","Data":"2c2f582665500e787c99a0a9097dbac250fe2cab4d4fed2d0b0d3c211d19ea35"} Oct 03 14:24:18 crc kubenswrapper[4636]: I1003 14:24:18.835912 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d7677974f-dtkft" event={"ID":"317017e9-687f-4a84-b896-fab84c269e2b","Type":"ContainerStarted","Data":"811694a666e1e868a51fd5fbe63613126e0dfb576ef46619f56d524e06122bed"} Oct 03 14:24:19 crc kubenswrapper[4636]: I1003 14:24:19.852743 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d7677974f-dtkft" 
event={"ID":"317017e9-687f-4a84-b896-fab84c269e2b","Type":"ContainerStarted","Data":"2a0de9aa6b6fa8a849eb041ffda04234996d41c46b6617a75ca9bf757585f66f"} Oct 03 14:24:19 crc kubenswrapper[4636]: I1003 14:24:19.853004 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:19 crc kubenswrapper[4636]: I1003 14:24:19.877604 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d7677974f-dtkft" podStartSLOduration=2.8775860399999997 podStartE2EDuration="2.87758604s" podCreationTimestamp="2025-10-03 14:24:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:24:19.87718788 +0000 UTC m=+1409.735914137" watchObservedRunningTime="2025-10-03 14:24:19.87758604 +0000 UTC m=+1409.736312287" Oct 03 14:24:22 crc kubenswrapper[4636]: I1003 14:24:22.487117 4636 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-89c5cd4d5-s7jpb" podUID="72cdceb9-a893-4565-aa03-d1cbdf9550ae" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.200:5353: i/o timeout" Oct 03 14:24:27 crc kubenswrapper[4636]: I1003 14:24:27.424274 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d7677974f-dtkft" Oct 03 14:24:27 crc kubenswrapper[4636]: I1003 14:24:27.502027 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-qtblw"] Oct 03 14:24:27 crc kubenswrapper[4636]: I1003 14:24:27.502275 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" podUID="72110982-08c6-4d46-bdd0-cb7c5076a252" containerName="dnsmasq-dns" containerID="cri-o://b79907ce8b77d5e3f9da6449b148d5a350d9d4a723867a9f81aa27fa80626301" gracePeriod=10 Oct 03 14:24:27 crc kubenswrapper[4636]: I1003 14:24:27.928214 4636 generic.go:334] "Generic (PLEG): container finished" podID="72110982-08c6-4d46-bdd0-cb7c5076a252" containerID="b79907ce8b77d5e3f9da6449b148d5a350d9d4a723867a9f81aa27fa80626301" exitCode=0 Oct 03 14:24:27 crc kubenswrapper[4636]: I1003 14:24:27.928507 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" event={"ID":"72110982-08c6-4d46-bdd0-cb7c5076a252","Type":"ContainerDied","Data":"b79907ce8b77d5e3f9da6449b148d5a350d9d4a723867a9f81aa27fa80626301"} Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.069723 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.185990 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-dns-swift-storage-0\") pod \"72110982-08c6-4d46-bdd0-cb7c5076a252\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.186076 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2lr7\" (UniqueName: \"kubernetes.io/projected/72110982-08c6-4d46-bdd0-cb7c5076a252-kube-api-access-s2lr7\") pod \"72110982-08c6-4d46-bdd0-cb7c5076a252\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.186167 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-ovsdbserver-nb\") pod \"72110982-08c6-4d46-bdd0-cb7c5076a252\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.186202 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-openstack-edpm-ipam\") pod \"72110982-08c6-4d46-bdd0-cb7c5076a252\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.186292 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-ovsdbserver-sb\") pod \"72110982-08c6-4d46-bdd0-cb7c5076a252\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.186320 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-config\") pod \"72110982-08c6-4d46-bdd0-cb7c5076a252\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.186431 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-dns-svc\") pod \"72110982-08c6-4d46-bdd0-cb7c5076a252\" (UID: \"72110982-08c6-4d46-bdd0-cb7c5076a252\") " Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.236047 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72110982-08c6-4d46-bdd0-cb7c5076a252-kube-api-access-s2lr7" (OuterVolumeSpecName: "kube-api-access-s2lr7") pod "72110982-08c6-4d46-bdd0-cb7c5076a252" (UID: "72110982-08c6-4d46-bdd0-cb7c5076a252"). InnerVolumeSpecName "kube-api-access-s2lr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.262703 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "72110982-08c6-4d46-bdd0-cb7c5076a252" (UID: "72110982-08c6-4d46-bdd0-cb7c5076a252"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.269325 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "72110982-08c6-4d46-bdd0-cb7c5076a252" (UID: "72110982-08c6-4d46-bdd0-cb7c5076a252"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.279141 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "72110982-08c6-4d46-bdd0-cb7c5076a252" (UID: "72110982-08c6-4d46-bdd0-cb7c5076a252"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.279403 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "72110982-08c6-4d46-bdd0-cb7c5076a252" (UID: "72110982-08c6-4d46-bdd0-cb7c5076a252"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.288479 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.288518 4636 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.288533 4636 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.288546 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2lr7\" (UniqueName: \"kubernetes.io/projected/72110982-08c6-4d46-bdd0-cb7c5076a252-kube-api-access-s2lr7\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.288562 4636 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.291769 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-config" (OuterVolumeSpecName: "config") pod "72110982-08c6-4d46-bdd0-cb7c5076a252" (UID: "72110982-08c6-4d46-bdd0-cb7c5076a252"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.307748 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "72110982-08c6-4d46-bdd0-cb7c5076a252" (UID: "72110982-08c6-4d46-bdd0-cb7c5076a252"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.390541 4636 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.390579 4636 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72110982-08c6-4d46-bdd0-cb7c5076a252-config\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.939797 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" event={"ID":"72110982-08c6-4d46-bdd0-cb7c5076a252","Type":"ContainerDied","Data":"951200092ab72b1ede31c935fa4df2631cbea412b672bae94eab3ebfbe934de4"} Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.940213 4636 scope.go:117] "RemoveContainer" containerID="b79907ce8b77d5e3f9da6449b148d5a350d9d4a723867a9f81aa27fa80626301" Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.940070 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-qtblw" Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.972155 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-qtblw"] Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.981915 4636 scope.go:117] "RemoveContainer" containerID="3098872bbce09671aa8adaa4ed298fd0d7193a75396eb3866d19fb8042e6e945" Oct 03 14:24:28 crc kubenswrapper[4636]: I1003 14:24:28.987133 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-qtblw"] Oct 03 14:24:30 crc kubenswrapper[4636]: I1003 14:24:30.118203 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l6glb"] Oct 03 14:24:30 crc kubenswrapper[4636]: E1003 14:24:30.118616 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72110982-08c6-4d46-bdd0-cb7c5076a252" containerName="dnsmasq-dns" Oct 03 14:24:30 crc kubenswrapper[4636]: I1003 14:24:30.118628 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="72110982-08c6-4d46-bdd0-cb7c5076a252" containerName="dnsmasq-dns" Oct 03 14:24:30 crc kubenswrapper[4636]: E1003 14:24:30.118637 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72cdceb9-a893-4565-aa03-d1cbdf9550ae" containerName="dnsmasq-dns" Oct 03 14:24:30 crc kubenswrapper[4636]: I1003 14:24:30.118644 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="72cdceb9-a893-4565-aa03-d1cbdf9550ae" containerName="dnsmasq-dns" Oct 03 14:24:30 crc kubenswrapper[4636]: E1003 14:24:30.118670 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72cdceb9-a893-4565-aa03-d1cbdf9550ae" containerName="init" Oct 03 14:24:30 crc kubenswrapper[4636]: I1003 14:24:30.118676 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="72cdceb9-a893-4565-aa03-d1cbdf9550ae" containerName="init" Oct 03 14:24:30 crc kubenswrapper[4636]: E1003 14:24:30.118684 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72110982-08c6-4d46-bdd0-cb7c5076a252" containerName="init" Oct 03 14:24:30 crc kubenswrapper[4636]: I1003 14:24:30.118690 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="72110982-08c6-4d46-bdd0-cb7c5076a252" containerName="init" Oct 03 14:24:30 crc kubenswrapper[4636]: I1003 
14:24:30.118871 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="72110982-08c6-4d46-bdd0-cb7c5076a252" containerName="dnsmasq-dns" Oct 03 14:24:30 crc kubenswrapper[4636]: I1003 14:24:30.118881 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="72cdceb9-a893-4565-aa03-d1cbdf9550ae" containerName="dnsmasq-dns" Oct 03 14:24:30 crc kubenswrapper[4636]: I1003 14:24:30.120559 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l6glb" Oct 03 14:24:30 crc kubenswrapper[4636]: I1003 14:24:30.142353 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l6glb"] Oct 03 14:24:30 crc kubenswrapper[4636]: I1003 14:24:30.221365 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c24e4026-9fcf-4b65-9977-8d4f8e801b97-utilities\") pod \"redhat-marketplace-l6glb\" (UID: \"c24e4026-9fcf-4b65-9977-8d4f8e801b97\") " pod="openshift-marketplace/redhat-marketplace-l6glb" Oct 03 14:24:30 crc kubenswrapper[4636]: I1003 14:24:30.221564 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c24e4026-9fcf-4b65-9977-8d4f8e801b97-catalog-content\") pod \"redhat-marketplace-l6glb\" (UID: \"c24e4026-9fcf-4b65-9977-8d4f8e801b97\") " pod="openshift-marketplace/redhat-marketplace-l6glb" Oct 03 14:24:30 crc kubenswrapper[4636]: I1003 14:24:30.224427 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnfht\" (UniqueName: \"kubernetes.io/projected/c24e4026-9fcf-4b65-9977-8d4f8e801b97-kube-api-access-nnfht\") pod \"redhat-marketplace-l6glb\" (UID: \"c24e4026-9fcf-4b65-9977-8d4f8e801b97\") " pod="openshift-marketplace/redhat-marketplace-l6glb" Oct 03 14:24:30 crc kubenswrapper[4636]: I1003 14:24:30.325629 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnfht\" (UniqueName: \"kubernetes.io/projected/c24e4026-9fcf-4b65-9977-8d4f8e801b97-kube-api-access-nnfht\") pod \"redhat-marketplace-l6glb\" (UID: \"c24e4026-9fcf-4b65-9977-8d4f8e801b97\") " pod="openshift-marketplace/redhat-marketplace-l6glb" Oct 03 14:24:30 crc kubenswrapper[4636]: I1003 14:24:30.325686 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c24e4026-9fcf-4b65-9977-8d4f8e801b97-utilities\") pod \"redhat-marketplace-l6glb\" (UID: \"c24e4026-9fcf-4b65-9977-8d4f8e801b97\") " pod="openshift-marketplace/redhat-marketplace-l6glb" Oct 03 14:24:30 crc kubenswrapper[4636]: I1003 14:24:30.325807 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c24e4026-9fcf-4b65-9977-8d4f8e801b97-catalog-content\") pod \"redhat-marketplace-l6glb\" (UID: \"c24e4026-9fcf-4b65-9977-8d4f8e801b97\") " pod="openshift-marketplace/redhat-marketplace-l6glb" Oct 03 14:24:30 crc kubenswrapper[4636]: I1003 14:24:30.326360 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c24e4026-9fcf-4b65-9977-8d4f8e801b97-utilities\") pod \"redhat-marketplace-l6glb\" (UID: \"c24e4026-9fcf-4b65-9977-8d4f8e801b97\") " pod="openshift-marketplace/redhat-marketplace-l6glb" Oct 03 14:24:30 crc 
kubenswrapper[4636]: I1003 14:24:30.326532 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c24e4026-9fcf-4b65-9977-8d4f8e801b97-catalog-content\") pod \"redhat-marketplace-l6glb\" (UID: \"c24e4026-9fcf-4b65-9977-8d4f8e801b97\") " pod="openshift-marketplace/redhat-marketplace-l6glb" Oct 03 14:24:30 crc kubenswrapper[4636]: I1003 14:24:30.346150 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnfht\" (UniqueName: \"kubernetes.io/projected/c24e4026-9fcf-4b65-9977-8d4f8e801b97-kube-api-access-nnfht\") pod \"redhat-marketplace-l6glb\" (UID: \"c24e4026-9fcf-4b65-9977-8d4f8e801b97\") " pod="openshift-marketplace/redhat-marketplace-l6glb" Oct 03 14:24:30 crc kubenswrapper[4636]: I1003 14:24:30.486556 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l6glb" Oct 03 14:24:30 crc kubenswrapper[4636]: I1003 14:24:30.805095 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72110982-08c6-4d46-bdd0-cb7c5076a252" path="/var/lib/kubelet/pods/72110982-08c6-4d46-bdd0-cb7c5076a252/volumes" Oct 03 14:24:30 crc kubenswrapper[4636]: I1003 14:24:30.940232 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l6glb"] Oct 03 14:24:30 crc kubenswrapper[4636]: I1003 14:24:30.959079 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l6glb" event={"ID":"c24e4026-9fcf-4b65-9977-8d4f8e801b97","Type":"ContainerStarted","Data":"c72637c710ea64c2cfafe3a5f8a8f34dd77fb8d25e52848ba30c40a3df0f6697"} Oct 03 14:24:31 crc kubenswrapper[4636]: I1003 14:24:31.968558 4636 generic.go:334] "Generic (PLEG): container finished" podID="c24e4026-9fcf-4b65-9977-8d4f8e801b97" containerID="052c56d1e0415f8d64054447c195ceea7083add98ae249f2f2db4ce2be04f5f8" exitCode=0 Oct 03 14:24:31 crc kubenswrapper[4636]: I1003 14:24:31.968872 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l6glb" event={"ID":"c24e4026-9fcf-4b65-9977-8d4f8e801b97","Type":"ContainerDied","Data":"052c56d1e0415f8d64054447c195ceea7083add98ae249f2f2db4ce2be04f5f8"} Oct 03 14:24:33 crc kubenswrapper[4636]: I1003 14:24:33.991265 4636 generic.go:334] "Generic (PLEG): container finished" podID="c24e4026-9fcf-4b65-9977-8d4f8e801b97" containerID="9d01d81cc001d3a276cf3318f8eef46700e53187f0e609f5d36cd0cffbabfab8" exitCode=0 Oct 03 14:24:33 crc kubenswrapper[4636]: I1003 14:24:33.991336 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l6glb" event={"ID":"c24e4026-9fcf-4b65-9977-8d4f8e801b97","Type":"ContainerDied","Data":"9d01d81cc001d3a276cf3318f8eef46700e53187f0e609f5d36cd0cffbabfab8"} Oct 03 14:24:35 crc kubenswrapper[4636]: I1003 14:24:35.001456 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l6glb" event={"ID":"c24e4026-9fcf-4b65-9977-8d4f8e801b97","Type":"ContainerStarted","Data":"56337535f20371ef3d2fc647359e903b2084633e09e83cc6bdd3f9f59f28fe09"} Oct 03 14:24:35 crc kubenswrapper[4636]: I1003 14:24:35.019621 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l6glb" podStartSLOduration=2.616304911 podStartE2EDuration="5.019606313s" podCreationTimestamp="2025-10-03 14:24:30 +0000 UTC" firstStartedPulling="2025-10-03 14:24:31.97058676 +0000 UTC 
m=+1421.829313007" lastFinishedPulling="2025-10-03 14:24:34.373888162 +0000 UTC m=+1424.232614409" observedRunningTime="2025-10-03 14:24:35.01834418 +0000 UTC m=+1424.877070437" watchObservedRunningTime="2025-10-03 14:24:35.019606313 +0000 UTC m=+1424.878332560" Oct 03 14:24:39 crc kubenswrapper[4636]: I1003 14:24:39.033158 4636 generic.go:334] "Generic (PLEG): container finished" podID="f7c3cb64-6553-4d95-8ccc-25f758b3cc97" containerID="35d9aff2b0dd94141b4abe4effca61d01263a6ec389f6d59009b7857254ced7e" exitCode=0 Oct 03 14:24:39 crc kubenswrapper[4636]: I1003 14:24:39.033203 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f7c3cb64-6553-4d95-8ccc-25f758b3cc97","Type":"ContainerDied","Data":"35d9aff2b0dd94141b4abe4effca61d01263a6ec389f6d59009b7857254ced7e"} Oct 03 14:24:40 crc kubenswrapper[4636]: I1003 14:24:40.043907 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f7c3cb64-6553-4d95-8ccc-25f758b3cc97","Type":"ContainerStarted","Data":"55bc38577727f7e2129fab4c1d81931d91ec464d1a0a95917790247496c4a2dd"} Oct 03 14:24:40 crc kubenswrapper[4636]: I1003 14:24:40.044687 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 03 14:24:40 crc kubenswrapper[4636]: I1003 14:24:40.045920 4636 generic.go:334] "Generic (PLEG): container finished" podID="e97eeb5a-f169-4c58-bda2-c727ca1f5126" containerID="1310a34ae63b3e412bc0c551b9d7606124d737992ed0096e34ec2a67154c047a" exitCode=0 Oct 03 14:24:40 crc kubenswrapper[4636]: I1003 14:24:40.045950 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e97eeb5a-f169-4c58-bda2-c727ca1f5126","Type":"ContainerDied","Data":"1310a34ae63b3e412bc0c551b9d7606124d737992ed0096e34ec2a67154c047a"} Oct 03 14:24:40 crc kubenswrapper[4636]: I1003 14:24:40.114209 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.114187166 podStartE2EDuration="36.114187166s" podCreationTimestamp="2025-10-03 14:24:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:24:40.10592497 +0000 UTC m=+1429.964651217" watchObservedRunningTime="2025-10-03 14:24:40.114187166 +0000 UTC m=+1429.972913413" Oct 03 14:24:40 crc kubenswrapper[4636]: I1003 14:24:40.487229 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l6glb" Oct 03 14:24:40 crc kubenswrapper[4636]: I1003 14:24:40.487396 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l6glb" Oct 03 14:24:40 crc kubenswrapper[4636]: I1003 14:24:40.544634 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l6glb" Oct 03 14:24:41 crc kubenswrapper[4636]: I1003 14:24:41.056623 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e97eeb5a-f169-4c58-bda2-c727ca1f5126","Type":"ContainerStarted","Data":"76869f414fd78e24767b3c48eece3e2edc39a4b13ba8badd3bb3abd8f43542e9"} Oct 03 14:24:41 crc kubenswrapper[4636]: I1003 14:24:41.057401 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:41 crc kubenswrapper[4636]: I1003 14:24:41.089895 4636 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.089874374 podStartE2EDuration="37.089874374s" podCreationTimestamp="2025-10-03 14:24:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 14:24:41.085308345 +0000 UTC m=+1430.944034602" watchObservedRunningTime="2025-10-03 14:24:41.089874374 +0000 UTC m=+1430.948600621" Oct 03 14:24:41 crc kubenswrapper[4636]: I1003 14:24:41.105407 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l6glb" Oct 03 14:24:41 crc kubenswrapper[4636]: I1003 14:24:41.166086 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l6glb"] Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.075849 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l6glb" podUID="c24e4026-9fcf-4b65-9977-8d4f8e801b97" containerName="registry-server" containerID="cri-o://56337535f20371ef3d2fc647359e903b2084633e09e83cc6bdd3f9f59f28fe09" gracePeriod=2 Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.195909 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zqtsl"] Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.199696 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zqtsl" Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.252314 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zqtsl"] Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.275751 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fbb3a21-476c-4c39-a42b-1f0846a53a2a-catalog-content\") pod \"certified-operators-zqtsl\" (UID: \"2fbb3a21-476c-4c39-a42b-1f0846a53a2a\") " pod="openshift-marketplace/certified-operators-zqtsl" Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.275819 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc745\" (UniqueName: \"kubernetes.io/projected/2fbb3a21-476c-4c39-a42b-1f0846a53a2a-kube-api-access-cc745\") pod \"certified-operators-zqtsl\" (UID: \"2fbb3a21-476c-4c39-a42b-1f0846a53a2a\") " pod="openshift-marketplace/certified-operators-zqtsl" Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.275853 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fbb3a21-476c-4c39-a42b-1f0846a53a2a-utilities\") pod \"certified-operators-zqtsl\" (UID: \"2fbb3a21-476c-4c39-a42b-1f0846a53a2a\") " pod="openshift-marketplace/certified-operators-zqtsl" Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.379073 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fbb3a21-476c-4c39-a42b-1f0846a53a2a-catalog-content\") pod \"certified-operators-zqtsl\" (UID: \"2fbb3a21-476c-4c39-a42b-1f0846a53a2a\") " pod="openshift-marketplace/certified-operators-zqtsl" Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.379457 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-cc745\" (UniqueName: \"kubernetes.io/projected/2fbb3a21-476c-4c39-a42b-1f0846a53a2a-kube-api-access-cc745\") pod \"certified-operators-zqtsl\" (UID: \"2fbb3a21-476c-4c39-a42b-1f0846a53a2a\") " pod="openshift-marketplace/certified-operators-zqtsl" Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.379498 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fbb3a21-476c-4c39-a42b-1f0846a53a2a-utilities\") pod \"certified-operators-zqtsl\" (UID: \"2fbb3a21-476c-4c39-a42b-1f0846a53a2a\") " pod="openshift-marketplace/certified-operators-zqtsl" Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.380208 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fbb3a21-476c-4c39-a42b-1f0846a53a2a-utilities\") pod \"certified-operators-zqtsl\" (UID: \"2fbb3a21-476c-4c39-a42b-1f0846a53a2a\") " pod="openshift-marketplace/certified-operators-zqtsl" Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.380525 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fbb3a21-476c-4c39-a42b-1f0846a53a2a-catalog-content\") pod \"certified-operators-zqtsl\" (UID: \"2fbb3a21-476c-4c39-a42b-1f0846a53a2a\") " pod="openshift-marketplace/certified-operators-zqtsl" Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.413305 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc745\" (UniqueName: \"kubernetes.io/projected/2fbb3a21-476c-4c39-a42b-1f0846a53a2a-kube-api-access-cc745\") pod \"certified-operators-zqtsl\" (UID: \"2fbb3a21-476c-4c39-a42b-1f0846a53a2a\") " pod="openshift-marketplace/certified-operators-zqtsl" Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.541488 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zqtsl" Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.629835 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l6glb" Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.685329 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c24e4026-9fcf-4b65-9977-8d4f8e801b97-utilities\") pod \"c24e4026-9fcf-4b65-9977-8d4f8e801b97\" (UID: \"c24e4026-9fcf-4b65-9977-8d4f8e801b97\") " Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.685413 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c24e4026-9fcf-4b65-9977-8d4f8e801b97-catalog-content\") pod \"c24e4026-9fcf-4b65-9977-8d4f8e801b97\" (UID: \"c24e4026-9fcf-4b65-9977-8d4f8e801b97\") " Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.685505 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnfht\" (UniqueName: \"kubernetes.io/projected/c24e4026-9fcf-4b65-9977-8d4f8e801b97-kube-api-access-nnfht\") pod \"c24e4026-9fcf-4b65-9977-8d4f8e801b97\" (UID: \"c24e4026-9fcf-4b65-9977-8d4f8e801b97\") " Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.691350 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c24e4026-9fcf-4b65-9977-8d4f8e801b97-kube-api-access-nnfht" (OuterVolumeSpecName: "kube-api-access-nnfht") pod "c24e4026-9fcf-4b65-9977-8d4f8e801b97" (UID: "c24e4026-9fcf-4b65-9977-8d4f8e801b97"). InnerVolumeSpecName "kube-api-access-nnfht". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.692792 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c24e4026-9fcf-4b65-9977-8d4f8e801b97-utilities" (OuterVolumeSpecName: "utilities") pod "c24e4026-9fcf-4b65-9977-8d4f8e801b97" (UID: "c24e4026-9fcf-4b65-9977-8d4f8e801b97"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.707274 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c24e4026-9fcf-4b65-9977-8d4f8e801b97-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c24e4026-9fcf-4b65-9977-8d4f8e801b97" (UID: "c24e4026-9fcf-4b65-9977-8d4f8e801b97"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.787814 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c24e4026-9fcf-4b65-9977-8d4f8e801b97-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.787854 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c24e4026-9fcf-4b65-9977-8d4f8e801b97-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:43 crc kubenswrapper[4636]: I1003 14:24:43.787871 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnfht\" (UniqueName: \"kubernetes.io/projected/c24e4026-9fcf-4b65-9977-8d4f8e801b97-kube-api-access-nnfht\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:44 crc kubenswrapper[4636]: I1003 14:24:44.087855 4636 generic.go:334] "Generic (PLEG): container finished" podID="c24e4026-9fcf-4b65-9977-8d4f8e801b97" containerID="56337535f20371ef3d2fc647359e903b2084633e09e83cc6bdd3f9f59f28fe09" exitCode=0 Oct 03 14:24:44 crc kubenswrapper[4636]: I1003 14:24:44.087909 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l6glb" event={"ID":"c24e4026-9fcf-4b65-9977-8d4f8e801b97","Type":"ContainerDied","Data":"56337535f20371ef3d2fc647359e903b2084633e09e83cc6bdd3f9f59f28fe09"} Oct 03 14:24:44 crc kubenswrapper[4636]: I1003 14:24:44.087940 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l6glb" event={"ID":"c24e4026-9fcf-4b65-9977-8d4f8e801b97","Type":"ContainerDied","Data":"c72637c710ea64c2cfafe3a5f8a8f34dd77fb8d25e52848ba30c40a3df0f6697"} Oct 03 14:24:44 crc kubenswrapper[4636]: I1003 14:24:44.087961 4636 scope.go:117] "RemoveContainer" containerID="56337535f20371ef3d2fc647359e903b2084633e09e83cc6bdd3f9f59f28fe09" Oct 03 14:24:44 crc kubenswrapper[4636]: I1003 14:24:44.088147 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l6glb" Oct 03 14:24:44 crc kubenswrapper[4636]: I1003 14:24:44.153408 4636 scope.go:117] "RemoveContainer" containerID="9d01d81cc001d3a276cf3318f8eef46700e53187f0e609f5d36cd0cffbabfab8" Oct 03 14:24:44 crc kubenswrapper[4636]: I1003 14:24:44.153478 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l6glb"] Oct 03 14:24:44 crc kubenswrapper[4636]: I1003 14:24:44.209088 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l6glb"] Oct 03 14:24:44 crc kubenswrapper[4636]: I1003 14:24:44.221946 4636 scope.go:117] "RemoveContainer" containerID="052c56d1e0415f8d64054447c195ceea7083add98ae249f2f2db4ce2be04f5f8" Oct 03 14:24:44 crc kubenswrapper[4636]: I1003 14:24:44.263621 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zqtsl"] Oct 03 14:24:44 crc kubenswrapper[4636]: I1003 14:24:44.303833 4636 scope.go:117] "RemoveContainer" containerID="56337535f20371ef3d2fc647359e903b2084633e09e83cc6bdd3f9f59f28fe09" Oct 03 14:24:44 crc kubenswrapper[4636]: E1003 14:24:44.305283 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56337535f20371ef3d2fc647359e903b2084633e09e83cc6bdd3f9f59f28fe09\": container with ID starting with 56337535f20371ef3d2fc647359e903b2084633e09e83cc6bdd3f9f59f28fe09 not found: ID does not exist" containerID="56337535f20371ef3d2fc647359e903b2084633e09e83cc6bdd3f9f59f28fe09" Oct 03 14:24:44 crc kubenswrapper[4636]: I1003 14:24:44.305317 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56337535f20371ef3d2fc647359e903b2084633e09e83cc6bdd3f9f59f28fe09"} err="failed to get container status \"56337535f20371ef3d2fc647359e903b2084633e09e83cc6bdd3f9f59f28fe09\": rpc error: code = NotFound desc = could not find container \"56337535f20371ef3d2fc647359e903b2084633e09e83cc6bdd3f9f59f28fe09\": container with ID starting with 56337535f20371ef3d2fc647359e903b2084633e09e83cc6bdd3f9f59f28fe09 not found: ID does not exist" Oct 03 14:24:44 crc kubenswrapper[4636]: I1003 14:24:44.305342 4636 scope.go:117] "RemoveContainer" containerID="9d01d81cc001d3a276cf3318f8eef46700e53187f0e609f5d36cd0cffbabfab8" Oct 03 14:24:44 crc kubenswrapper[4636]: E1003 14:24:44.308443 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d01d81cc001d3a276cf3318f8eef46700e53187f0e609f5d36cd0cffbabfab8\": container with ID starting with 9d01d81cc001d3a276cf3318f8eef46700e53187f0e609f5d36cd0cffbabfab8 not found: ID does not exist" containerID="9d01d81cc001d3a276cf3318f8eef46700e53187f0e609f5d36cd0cffbabfab8" Oct 03 14:24:44 crc kubenswrapper[4636]: I1003 14:24:44.308478 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d01d81cc001d3a276cf3318f8eef46700e53187f0e609f5d36cd0cffbabfab8"} err="failed to get container status \"9d01d81cc001d3a276cf3318f8eef46700e53187f0e609f5d36cd0cffbabfab8\": rpc error: code = NotFound desc = could not find container \"9d01d81cc001d3a276cf3318f8eef46700e53187f0e609f5d36cd0cffbabfab8\": container with ID starting with 9d01d81cc001d3a276cf3318f8eef46700e53187f0e609f5d36cd0cffbabfab8 not found: ID does not exist" Oct 03 14:24:44 crc kubenswrapper[4636]: I1003 14:24:44.308497 4636 scope.go:117] "RemoveContainer" 
containerID="052c56d1e0415f8d64054447c195ceea7083add98ae249f2f2db4ce2be04f5f8" Oct 03 14:24:44 crc kubenswrapper[4636]: E1003 14:24:44.311644 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"052c56d1e0415f8d64054447c195ceea7083add98ae249f2f2db4ce2be04f5f8\": container with ID starting with 052c56d1e0415f8d64054447c195ceea7083add98ae249f2f2db4ce2be04f5f8 not found: ID does not exist" containerID="052c56d1e0415f8d64054447c195ceea7083add98ae249f2f2db4ce2be04f5f8" Oct 03 14:24:44 crc kubenswrapper[4636]: I1003 14:24:44.311675 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"052c56d1e0415f8d64054447c195ceea7083add98ae249f2f2db4ce2be04f5f8"} err="failed to get container status \"052c56d1e0415f8d64054447c195ceea7083add98ae249f2f2db4ce2be04f5f8\": rpc error: code = NotFound desc = could not find container \"052c56d1e0415f8d64054447c195ceea7083add98ae249f2f2db4ce2be04f5f8\": container with ID starting with 052c56d1e0415f8d64054447c195ceea7083add98ae249f2f2db4ce2be04f5f8 not found: ID does not exist" Oct 03 14:24:44 crc kubenswrapper[4636]: I1003 14:24:44.807918 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c24e4026-9fcf-4b65-9977-8d4f8e801b97" path="/var/lib/kubelet/pods/c24e4026-9fcf-4b65-9977-8d4f8e801b97/volumes" Oct 03 14:24:45 crc kubenswrapper[4636]: I1003 14:24:45.098191 4636 generic.go:334] "Generic (PLEG): container finished" podID="2fbb3a21-476c-4c39-a42b-1f0846a53a2a" containerID="abc99bfd099bc1c7e9b533dbd805ee694298ba52f28c820b0faa88fc89d3ee26" exitCode=0 Oct 03 14:24:45 crc kubenswrapper[4636]: I1003 14:24:45.098289 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zqtsl" event={"ID":"2fbb3a21-476c-4c39-a42b-1f0846a53a2a","Type":"ContainerDied","Data":"abc99bfd099bc1c7e9b533dbd805ee694298ba52f28c820b0faa88fc89d3ee26"} Oct 03 14:24:45 crc kubenswrapper[4636]: I1003 14:24:45.098321 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zqtsl" event={"ID":"2fbb3a21-476c-4c39-a42b-1f0846a53a2a","Type":"ContainerStarted","Data":"8c87f0a13d8d16cadd7ccd434d207d574c7ef497b510391bd849bbdca041e5f7"} Oct 03 14:24:47 crc kubenswrapper[4636]: I1003 14:24:47.117506 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zqtsl" event={"ID":"2fbb3a21-476c-4c39-a42b-1f0846a53a2a","Type":"ContainerStarted","Data":"f7555811d82b63ada07aaa78ff32d0dc8d7cc5fb0decb93b8d0c5464115b71bd"} Oct 03 14:24:48 crc kubenswrapper[4636]: I1003 14:24:48.130077 4636 generic.go:334] "Generic (PLEG): container finished" podID="2fbb3a21-476c-4c39-a42b-1f0846a53a2a" containerID="f7555811d82b63ada07aaa78ff32d0dc8d7cc5fb0decb93b8d0c5464115b71bd" exitCode=0 Oct 03 14:24:48 crc kubenswrapper[4636]: I1003 14:24:48.130165 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zqtsl" event={"ID":"2fbb3a21-476c-4c39-a42b-1f0846a53a2a","Type":"ContainerDied","Data":"f7555811d82b63ada07aaa78ff32d0dc8d7cc5fb0decb93b8d0c5464115b71bd"} Oct 03 14:24:49 crc kubenswrapper[4636]: I1003 14:24:49.141074 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zqtsl" event={"ID":"2fbb3a21-476c-4c39-a42b-1f0846a53a2a","Type":"ContainerStarted","Data":"4e02062bd0778bfeb0bd1e47c75ee3c51a0e31030057a841fc7b92b6efcb7070"} Oct 03 14:24:49 crc kubenswrapper[4636]: 
I1003 14:24:49.166281 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zqtsl" podStartSLOduration=2.638011691 podStartE2EDuration="6.166227844s" podCreationTimestamp="2025-10-03 14:24:43 +0000 UTC" firstStartedPulling="2025-10-03 14:24:45.100296303 +0000 UTC m=+1434.959022550" lastFinishedPulling="2025-10-03 14:24:48.628512456 +0000 UTC m=+1438.487238703" observedRunningTime="2025-10-03 14:24:49.15919126 +0000 UTC m=+1439.017917507" watchObservedRunningTime="2025-10-03 14:24:49.166227844 +0000 UTC m=+1439.024954091" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.059382 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw"] Oct 03 14:24:51 crc kubenswrapper[4636]: E1003 14:24:51.061424 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24e4026-9fcf-4b65-9977-8d4f8e801b97" containerName="extract-content" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.061546 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e4026-9fcf-4b65-9977-8d4f8e801b97" containerName="extract-content" Oct 03 14:24:51 crc kubenswrapper[4636]: E1003 14:24:51.061644 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24e4026-9fcf-4b65-9977-8d4f8e801b97" containerName="registry-server" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.061701 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e4026-9fcf-4b65-9977-8d4f8e801b97" containerName="registry-server" Oct 03 14:24:51 crc kubenswrapper[4636]: E1003 14:24:51.061766 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24e4026-9fcf-4b65-9977-8d4f8e801b97" containerName="extract-utilities" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.061818 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e4026-9fcf-4b65-9977-8d4f8e801b97" containerName="extract-utilities" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.062058 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24e4026-9fcf-4b65-9977-8d4f8e801b97" containerName="registry-server" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.062973 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.067760 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.068596 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.072202 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.081544 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw"] Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.082289 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8b7c8" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.234204 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb85b02-3bf8-4fe8-a060-c3593e995499-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw\" (UID: \"9eb85b02-3bf8-4fe8-a060-c3593e995499\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.234335 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cngvm\" (UniqueName: \"kubernetes.io/projected/9eb85b02-3bf8-4fe8-a060-c3593e995499-kube-api-access-cngvm\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw\" (UID: \"9eb85b02-3bf8-4fe8-a060-c3593e995499\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.234509 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9eb85b02-3bf8-4fe8-a060-c3593e995499-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw\" (UID: \"9eb85b02-3bf8-4fe8-a060-c3593e995499\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.234590 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9eb85b02-3bf8-4fe8-a060-c3593e995499-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw\" (UID: \"9eb85b02-3bf8-4fe8-a060-c3593e995499\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.337225 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb85b02-3bf8-4fe8-a060-c3593e995499-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw\" (UID: \"9eb85b02-3bf8-4fe8-a060-c3593e995499\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.337318 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cngvm\" (UniqueName: 
\"kubernetes.io/projected/9eb85b02-3bf8-4fe8-a060-c3593e995499-kube-api-access-cngvm\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw\" (UID: \"9eb85b02-3bf8-4fe8-a060-c3593e995499\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.337431 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9eb85b02-3bf8-4fe8-a060-c3593e995499-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw\" (UID: \"9eb85b02-3bf8-4fe8-a060-c3593e995499\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.337478 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9eb85b02-3bf8-4fe8-a060-c3593e995499-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw\" (UID: \"9eb85b02-3bf8-4fe8-a060-c3593e995499\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.343647 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9eb85b02-3bf8-4fe8-a060-c3593e995499-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw\" (UID: \"9eb85b02-3bf8-4fe8-a060-c3593e995499\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.347839 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9eb85b02-3bf8-4fe8-a060-c3593e995499-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw\" (UID: \"9eb85b02-3bf8-4fe8-a060-c3593e995499\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.363883 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb85b02-3bf8-4fe8-a060-c3593e995499-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw\" (UID: \"9eb85b02-3bf8-4fe8-a060-c3593e995499\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.384128 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cngvm\" (UniqueName: \"kubernetes.io/projected/9eb85b02-3bf8-4fe8-a060-c3593e995499-kube-api-access-cngvm\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw\" (UID: \"9eb85b02-3bf8-4fe8-a060-c3593e995499\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" Oct 03 14:24:51 crc kubenswrapper[4636]: I1003 14:24:51.390258 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" Oct 03 14:24:52 crc kubenswrapper[4636]: I1003 14:24:52.381391 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw"] Oct 03 14:24:52 crc kubenswrapper[4636]: W1003 14:24:52.384932 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9eb85b02_3bf8_4fe8_a060_c3593e995499.slice/crio-1d6e616052ecbd74b84ffe37a688d5464c6d0c3c92f2cc52f02dc6904c9a5a0d WatchSource:0}: Error finding container 1d6e616052ecbd74b84ffe37a688d5464c6d0c3c92f2cc52f02dc6904c9a5a0d: Status 404 returned error can't find the container with id 1d6e616052ecbd74b84ffe37a688d5464c6d0c3c92f2cc52f02dc6904c9a5a0d Oct 03 14:24:53 crc kubenswrapper[4636]: I1003 14:24:53.175917 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" event={"ID":"9eb85b02-3bf8-4fe8-a060-c3593e995499","Type":"ContainerStarted","Data":"1d6e616052ecbd74b84ffe37a688d5464c6d0c3c92f2cc52f02dc6904c9a5a0d"} Oct 03 14:24:53 crc kubenswrapper[4636]: I1003 14:24:53.542456 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zqtsl" Oct 03 14:24:53 crc kubenswrapper[4636]: I1003 14:24:53.542532 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zqtsl" Oct 03 14:24:53 crc kubenswrapper[4636]: I1003 14:24:53.599811 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zqtsl" Oct 03 14:24:54 crc kubenswrapper[4636]: I1003 14:24:54.233872 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zqtsl" Oct 03 14:24:54 crc kubenswrapper[4636]: I1003 14:24:54.281854 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zqtsl"] Oct 03 14:24:55 crc kubenswrapper[4636]: I1003 14:24:55.108307 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 03 14:24:55 crc kubenswrapper[4636]: I1003 14:24:55.496369 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 03 14:24:56 crc kubenswrapper[4636]: I1003 14:24:56.225601 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zqtsl" podUID="2fbb3a21-476c-4c39-a42b-1f0846a53a2a" containerName="registry-server" containerID="cri-o://4e02062bd0778bfeb0bd1e47c75ee3c51a0e31030057a841fc7b92b6efcb7070" gracePeriod=2 Oct 03 14:24:56 crc kubenswrapper[4636]: I1003 14:24:56.763833 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zqtsl" Oct 03 14:24:56 crc kubenswrapper[4636]: I1003 14:24:56.875571 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fbb3a21-476c-4c39-a42b-1f0846a53a2a-catalog-content\") pod \"2fbb3a21-476c-4c39-a42b-1f0846a53a2a\" (UID: \"2fbb3a21-476c-4c39-a42b-1f0846a53a2a\") " Oct 03 14:24:56 crc kubenswrapper[4636]: I1003 14:24:56.875631 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc745\" (UniqueName: \"kubernetes.io/projected/2fbb3a21-476c-4c39-a42b-1f0846a53a2a-kube-api-access-cc745\") pod \"2fbb3a21-476c-4c39-a42b-1f0846a53a2a\" (UID: \"2fbb3a21-476c-4c39-a42b-1f0846a53a2a\") " Oct 03 14:24:56 crc kubenswrapper[4636]: I1003 14:24:56.875722 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fbb3a21-476c-4c39-a42b-1f0846a53a2a-utilities\") pod \"2fbb3a21-476c-4c39-a42b-1f0846a53a2a\" (UID: \"2fbb3a21-476c-4c39-a42b-1f0846a53a2a\") " Oct 03 14:24:56 crc kubenswrapper[4636]: I1003 14:24:56.876943 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fbb3a21-476c-4c39-a42b-1f0846a53a2a-utilities" (OuterVolumeSpecName: "utilities") pod "2fbb3a21-476c-4c39-a42b-1f0846a53a2a" (UID: "2fbb3a21-476c-4c39-a42b-1f0846a53a2a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:24:56 crc kubenswrapper[4636]: I1003 14:24:56.881527 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fbb3a21-476c-4c39-a42b-1f0846a53a2a-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:56 crc kubenswrapper[4636]: I1003 14:24:56.899023 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fbb3a21-476c-4c39-a42b-1f0846a53a2a-kube-api-access-cc745" (OuterVolumeSpecName: "kube-api-access-cc745") pod "2fbb3a21-476c-4c39-a42b-1f0846a53a2a" (UID: "2fbb3a21-476c-4c39-a42b-1f0846a53a2a"). InnerVolumeSpecName "kube-api-access-cc745". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:24:56 crc kubenswrapper[4636]: I1003 14:24:56.943754 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fbb3a21-476c-4c39-a42b-1f0846a53a2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2fbb3a21-476c-4c39-a42b-1f0846a53a2a" (UID: "2fbb3a21-476c-4c39-a42b-1f0846a53a2a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:24:56 crc kubenswrapper[4636]: I1003 14:24:56.982689 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fbb3a21-476c-4c39-a42b-1f0846a53a2a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:56 crc kubenswrapper[4636]: I1003 14:24:56.982740 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc745\" (UniqueName: \"kubernetes.io/projected/2fbb3a21-476c-4c39-a42b-1f0846a53a2a-kube-api-access-cc745\") on node \"crc\" DevicePath \"\"" Oct 03 14:24:57 crc kubenswrapper[4636]: I1003 14:24:57.255009 4636 generic.go:334] "Generic (PLEG): container finished" podID="2fbb3a21-476c-4c39-a42b-1f0846a53a2a" containerID="4e02062bd0778bfeb0bd1e47c75ee3c51a0e31030057a841fc7b92b6efcb7070" exitCode=0 Oct 03 14:24:57 crc kubenswrapper[4636]: I1003 14:24:57.255059 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zqtsl" event={"ID":"2fbb3a21-476c-4c39-a42b-1f0846a53a2a","Type":"ContainerDied","Data":"4e02062bd0778bfeb0bd1e47c75ee3c51a0e31030057a841fc7b92b6efcb7070"} Oct 03 14:24:57 crc kubenswrapper[4636]: I1003 14:24:57.255090 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zqtsl" Oct 03 14:24:57 crc kubenswrapper[4636]: I1003 14:24:57.255120 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zqtsl" event={"ID":"2fbb3a21-476c-4c39-a42b-1f0846a53a2a","Type":"ContainerDied","Data":"8c87f0a13d8d16cadd7ccd434d207d574c7ef497b510391bd849bbdca041e5f7"} Oct 03 14:24:57 crc kubenswrapper[4636]: I1003 14:24:57.255144 4636 scope.go:117] "RemoveContainer" containerID="4e02062bd0778bfeb0bd1e47c75ee3c51a0e31030057a841fc7b92b6efcb7070" Oct 03 14:24:57 crc kubenswrapper[4636]: I1003 14:24:57.295607 4636 scope.go:117] "RemoveContainer" containerID="f7555811d82b63ada07aaa78ff32d0dc8d7cc5fb0decb93b8d0c5464115b71bd" Oct 03 14:24:57 crc kubenswrapper[4636]: I1003 14:24:57.335957 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zqtsl"] Oct 03 14:24:57 crc kubenswrapper[4636]: I1003 14:24:57.341055 4636 scope.go:117] "RemoveContainer" containerID="abc99bfd099bc1c7e9b533dbd805ee694298ba52f28c820b0faa88fc89d3ee26" Oct 03 14:24:57 crc kubenswrapper[4636]: I1003 14:24:57.347519 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zqtsl"] Oct 03 14:24:57 crc kubenswrapper[4636]: I1003 14:24:57.378776 4636 scope.go:117] "RemoveContainer" containerID="4e02062bd0778bfeb0bd1e47c75ee3c51a0e31030057a841fc7b92b6efcb7070" Oct 03 14:24:57 crc kubenswrapper[4636]: E1003 14:24:57.379195 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e02062bd0778bfeb0bd1e47c75ee3c51a0e31030057a841fc7b92b6efcb7070\": container with ID starting with 4e02062bd0778bfeb0bd1e47c75ee3c51a0e31030057a841fc7b92b6efcb7070 not found: ID does not exist" containerID="4e02062bd0778bfeb0bd1e47c75ee3c51a0e31030057a841fc7b92b6efcb7070" Oct 03 14:24:57 crc kubenswrapper[4636]: I1003 14:24:57.379248 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e02062bd0778bfeb0bd1e47c75ee3c51a0e31030057a841fc7b92b6efcb7070"} err="failed to get container status 
\"4e02062bd0778bfeb0bd1e47c75ee3c51a0e31030057a841fc7b92b6efcb7070\": rpc error: code = NotFound desc = could not find container \"4e02062bd0778bfeb0bd1e47c75ee3c51a0e31030057a841fc7b92b6efcb7070\": container with ID starting with 4e02062bd0778bfeb0bd1e47c75ee3c51a0e31030057a841fc7b92b6efcb7070 not found: ID does not exist" Oct 03 14:24:57 crc kubenswrapper[4636]: I1003 14:24:57.379281 4636 scope.go:117] "RemoveContainer" containerID="f7555811d82b63ada07aaa78ff32d0dc8d7cc5fb0decb93b8d0c5464115b71bd" Oct 03 14:24:57 crc kubenswrapper[4636]: E1003 14:24:57.379705 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7555811d82b63ada07aaa78ff32d0dc8d7cc5fb0decb93b8d0c5464115b71bd\": container with ID starting with f7555811d82b63ada07aaa78ff32d0dc8d7cc5fb0decb93b8d0c5464115b71bd not found: ID does not exist" containerID="f7555811d82b63ada07aaa78ff32d0dc8d7cc5fb0decb93b8d0c5464115b71bd" Oct 03 14:24:57 crc kubenswrapper[4636]: I1003 14:24:57.379749 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7555811d82b63ada07aaa78ff32d0dc8d7cc5fb0decb93b8d0c5464115b71bd"} err="failed to get container status \"f7555811d82b63ada07aaa78ff32d0dc8d7cc5fb0decb93b8d0c5464115b71bd\": rpc error: code = NotFound desc = could not find container \"f7555811d82b63ada07aaa78ff32d0dc8d7cc5fb0decb93b8d0c5464115b71bd\": container with ID starting with f7555811d82b63ada07aaa78ff32d0dc8d7cc5fb0decb93b8d0c5464115b71bd not found: ID does not exist" Oct 03 14:24:57 crc kubenswrapper[4636]: I1003 14:24:57.379779 4636 scope.go:117] "RemoveContainer" containerID="abc99bfd099bc1c7e9b533dbd805ee694298ba52f28c820b0faa88fc89d3ee26" Oct 03 14:24:57 crc kubenswrapper[4636]: E1003 14:24:57.382561 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abc99bfd099bc1c7e9b533dbd805ee694298ba52f28c820b0faa88fc89d3ee26\": container with ID starting with abc99bfd099bc1c7e9b533dbd805ee694298ba52f28c820b0faa88fc89d3ee26 not found: ID does not exist" containerID="abc99bfd099bc1c7e9b533dbd805ee694298ba52f28c820b0faa88fc89d3ee26" Oct 03 14:24:57 crc kubenswrapper[4636]: I1003 14:24:57.382606 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abc99bfd099bc1c7e9b533dbd805ee694298ba52f28c820b0faa88fc89d3ee26"} err="failed to get container status \"abc99bfd099bc1c7e9b533dbd805ee694298ba52f28c820b0faa88fc89d3ee26\": rpc error: code = NotFound desc = could not find container \"abc99bfd099bc1c7e9b533dbd805ee694298ba52f28c820b0faa88fc89d3ee26\": container with ID starting with abc99bfd099bc1c7e9b533dbd805ee694298ba52f28c820b0faa88fc89d3ee26 not found: ID does not exist" Oct 03 14:24:58 crc kubenswrapper[4636]: I1003 14:24:58.805222 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fbb3a21-476c-4c39-a42b-1f0846a53a2a" path="/var/lib/kubelet/pods/2fbb3a21-476c-4c39-a42b-1f0846a53a2a/volumes" Oct 03 14:25:00 crc kubenswrapper[4636]: I1003 14:25:00.169124 4636 scope.go:117] "RemoveContainer" containerID="7cc5445780c2bc60b13e6c4386f884c0cc76b440675901fee80ec24a5ab19745" Oct 03 14:25:08 crc kubenswrapper[4636]: E1003 14:25:08.781584 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest" Oct 03 14:25:08 crc kubenswrapper[4636]: E1003 
14:25:08.782030 4636 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 03 14:25:08 crc kubenswrapper[4636]: container &Container{Name:repo-setup-edpm-deployment-openstack-edpm-ipam,Image:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,Command:[],Args:[ansible-runner run /runner -p playbook.yaml -i repo-setup-edpm-deployment-openstack-edpm-ipam],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ANSIBLE_VERBOSITY,Value:2,ValueFrom:nil,},EnvVar{Name:RUNNER_PLAYBOOK,Value: Oct 03 14:25:08 crc kubenswrapper[4636]: - hosts: all Oct 03 14:25:08 crc kubenswrapper[4636]: strategy: linear Oct 03 14:25:08 crc kubenswrapper[4636]: tasks: Oct 03 14:25:08 crc kubenswrapper[4636]: - name: Enable podified-repos Oct 03 14:25:08 crc kubenswrapper[4636]: become: true Oct 03 14:25:08 crc kubenswrapper[4636]: ansible.builtin.shell: | Oct 03 14:25:08 crc kubenswrapper[4636]: set -euxo pipefail Oct 03 14:25:08 crc kubenswrapper[4636]: pushd /var/tmp Oct 03 14:25:08 crc kubenswrapper[4636]: curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz Oct 03 14:25:08 crc kubenswrapper[4636]: pushd repo-setup-main Oct 03 14:25:08 crc kubenswrapper[4636]: python3 -m venv ./venv Oct 03 14:25:08 crc kubenswrapper[4636]: PBR_VERSION=0.0.0 ./venv/bin/pip install ./ Oct 03 14:25:08 crc kubenswrapper[4636]: ./venv/bin/repo-setup current-podified -b antelope Oct 03 14:25:08 crc kubenswrapper[4636]: popd Oct 03 14:25:08 crc kubenswrapper[4636]: rm -rf repo-setup-main Oct 03 14:25:08 crc kubenswrapper[4636]: Oct 03 14:25:08 crc kubenswrapper[4636]: Oct 03 14:25:08 crc kubenswrapper[4636]: ,ValueFrom:nil,},EnvVar{Name:RUNNER_EXTRA_VARS,Value: Oct 03 14:25:08 crc kubenswrapper[4636]: edpm_override_hosts: openstack-edpm-ipam Oct 03 14:25:08 crc kubenswrapper[4636]: edpm_service_type: repo-setup Oct 03 14:25:08 crc kubenswrapper[4636]: Oct 03 14:25:08 crc kubenswrapper[4636]: Oct 03 14:25:08 crc kubenswrapper[4636]: 
,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:repo-setup-combined-ca-bundle,ReadOnly:false,MountPath:/var/lib/openstack/cacerts/repo-setup,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/runner/env/ssh_key,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:inventory,ReadOnly:false,MountPath:/runner/inventory/hosts,SubPath:inventory,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cngvm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:openstack-aee-default-env,},Optional:*true,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw_openstack(9eb85b02-3bf8-4fe8-a060-c3593e995499): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Oct 03 14:25:08 crc kubenswrapper[4636]: > logger="UnhandledError" Oct 03 14:25:08 crc kubenswrapper[4636]: E1003 14:25:08.783091 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" podUID="9eb85b02-3bf8-4fe8-a060-c3593e995499" Oct 03 14:25:09 crc kubenswrapper[4636]: E1003 14:25:09.367865 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest\\\"\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" podUID="9eb85b02-3bf8-4fe8-a060-c3593e995499" Oct 03 14:25:24 crc kubenswrapper[4636]: I1003 14:25:24.511491 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" event={"ID":"9eb85b02-3bf8-4fe8-a060-c3593e995499","Type":"ContainerStarted","Data":"6664fb4d33777a22f3b23f221081d4c25fe5d5723674d4badccd9a64b3893b24"} Oct 03 14:25:24 crc kubenswrapper[4636]: I1003 14:25:24.544544 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" podStartSLOduration=2.6055901759999998 podStartE2EDuration="33.544526953s" podCreationTimestamp="2025-10-03 14:24:51 +0000 UTC" firstStartedPulling="2025-10-03 14:24:52.387005758 +0000 UTC m=+1442.245732005" lastFinishedPulling="2025-10-03 14:25:23.325942535 +0000 UTC m=+1473.184668782" 
observedRunningTime="2025-10-03 14:25:24.535749794 +0000 UTC m=+1474.394476041" watchObservedRunningTime="2025-10-03 14:25:24.544526953 +0000 UTC m=+1474.403253200" Oct 03 14:25:32 crc kubenswrapper[4636]: I1003 14:25:32.040596 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-srl5v"] Oct 03 14:25:32 crc kubenswrapper[4636]: E1003 14:25:32.041328 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fbb3a21-476c-4c39-a42b-1f0846a53a2a" containerName="extract-content" Oct 03 14:25:32 crc kubenswrapper[4636]: I1003 14:25:32.041340 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fbb3a21-476c-4c39-a42b-1f0846a53a2a" containerName="extract-content" Oct 03 14:25:32 crc kubenswrapper[4636]: E1003 14:25:32.041354 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fbb3a21-476c-4c39-a42b-1f0846a53a2a" containerName="registry-server" Oct 03 14:25:32 crc kubenswrapper[4636]: I1003 14:25:32.041361 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fbb3a21-476c-4c39-a42b-1f0846a53a2a" containerName="registry-server" Oct 03 14:25:32 crc kubenswrapper[4636]: E1003 14:25:32.041378 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fbb3a21-476c-4c39-a42b-1f0846a53a2a" containerName="extract-utilities" Oct 03 14:25:32 crc kubenswrapper[4636]: I1003 14:25:32.041384 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fbb3a21-476c-4c39-a42b-1f0846a53a2a" containerName="extract-utilities" Oct 03 14:25:32 crc kubenswrapper[4636]: I1003 14:25:32.041597 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fbb3a21-476c-4c39-a42b-1f0846a53a2a" containerName="registry-server" Oct 03 14:25:32 crc kubenswrapper[4636]: I1003 14:25:32.042919 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-srl5v" Oct 03 14:25:32 crc kubenswrapper[4636]: I1003 14:25:32.060707 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-srl5v"] Oct 03 14:25:32 crc kubenswrapper[4636]: I1003 14:25:32.083657 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd4d5068-5bad-4992-bd16-07055acb5f19-utilities\") pod \"community-operators-srl5v\" (UID: \"dd4d5068-5bad-4992-bd16-07055acb5f19\") " pod="openshift-marketplace/community-operators-srl5v" Oct 03 14:25:32 crc kubenswrapper[4636]: I1003 14:25:32.083721 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd4d5068-5bad-4992-bd16-07055acb5f19-catalog-content\") pod \"community-operators-srl5v\" (UID: \"dd4d5068-5bad-4992-bd16-07055acb5f19\") " pod="openshift-marketplace/community-operators-srl5v" Oct 03 14:25:32 crc kubenswrapper[4636]: I1003 14:25:32.083766 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkfmh\" (UniqueName: \"kubernetes.io/projected/dd4d5068-5bad-4992-bd16-07055acb5f19-kube-api-access-hkfmh\") pod \"community-operators-srl5v\" (UID: \"dd4d5068-5bad-4992-bd16-07055acb5f19\") " pod="openshift-marketplace/community-operators-srl5v" Oct 03 14:25:32 crc kubenswrapper[4636]: I1003 14:25:32.185260 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd4d5068-5bad-4992-bd16-07055acb5f19-utilities\") pod \"community-operators-srl5v\" (UID: \"dd4d5068-5bad-4992-bd16-07055acb5f19\") " pod="openshift-marketplace/community-operators-srl5v" Oct 03 14:25:32 crc kubenswrapper[4636]: I1003 14:25:32.185305 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd4d5068-5bad-4992-bd16-07055acb5f19-catalog-content\") pod \"community-operators-srl5v\" (UID: \"dd4d5068-5bad-4992-bd16-07055acb5f19\") " pod="openshift-marketplace/community-operators-srl5v" Oct 03 14:25:32 crc kubenswrapper[4636]: I1003 14:25:32.185347 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkfmh\" (UniqueName: \"kubernetes.io/projected/dd4d5068-5bad-4992-bd16-07055acb5f19-kube-api-access-hkfmh\") pod \"community-operators-srl5v\" (UID: \"dd4d5068-5bad-4992-bd16-07055acb5f19\") " pod="openshift-marketplace/community-operators-srl5v" Oct 03 14:25:32 crc kubenswrapper[4636]: I1003 14:25:32.186339 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd4d5068-5bad-4992-bd16-07055acb5f19-utilities\") pod \"community-operators-srl5v\" (UID: \"dd4d5068-5bad-4992-bd16-07055acb5f19\") " pod="openshift-marketplace/community-operators-srl5v" Oct 03 14:25:32 crc kubenswrapper[4636]: I1003 14:25:32.186580 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd4d5068-5bad-4992-bd16-07055acb5f19-catalog-content\") pod \"community-operators-srl5v\" (UID: \"dd4d5068-5bad-4992-bd16-07055acb5f19\") " pod="openshift-marketplace/community-operators-srl5v" Oct 03 14:25:32 crc kubenswrapper[4636]: I1003 14:25:32.205142 4636 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hkfmh\" (UniqueName: \"kubernetes.io/projected/dd4d5068-5bad-4992-bd16-07055acb5f19-kube-api-access-hkfmh\") pod \"community-operators-srl5v\" (UID: \"dd4d5068-5bad-4992-bd16-07055acb5f19\") " pod="openshift-marketplace/community-operators-srl5v" Oct 03 14:25:32 crc kubenswrapper[4636]: I1003 14:25:32.363730 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-srl5v" Oct 03 14:25:32 crc kubenswrapper[4636]: I1003 14:25:32.850181 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-srl5v"] Oct 03 14:25:33 crc kubenswrapper[4636]: I1003 14:25:33.588251 4636 generic.go:334] "Generic (PLEG): container finished" podID="dd4d5068-5bad-4992-bd16-07055acb5f19" containerID="ed32930e9d81b2c339d5f8f6e23c5fb7fbc5226a360e29c2e7b49cab2934827f" exitCode=0 Oct 03 14:25:33 crc kubenswrapper[4636]: I1003 14:25:33.588439 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srl5v" event={"ID":"dd4d5068-5bad-4992-bd16-07055acb5f19","Type":"ContainerDied","Data":"ed32930e9d81b2c339d5f8f6e23c5fb7fbc5226a360e29c2e7b49cab2934827f"} Oct 03 14:25:33 crc kubenswrapper[4636]: I1003 14:25:33.588595 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srl5v" event={"ID":"dd4d5068-5bad-4992-bd16-07055acb5f19","Type":"ContainerStarted","Data":"97a10d1dee09a4e5389d20d1ef15563988020cf5838f2d59e2ad3fbb0f053ad3"} Oct 03 14:25:36 crc kubenswrapper[4636]: I1003 14:25:36.628455 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srl5v" event={"ID":"dd4d5068-5bad-4992-bd16-07055acb5f19","Type":"ContainerStarted","Data":"975a0654eb7ec4d9cb22bf34bd4789d0806b1681eba4e75c30ef6b8fb9aced14"} Oct 03 14:25:38 crc kubenswrapper[4636]: I1003 14:25:38.650725 4636 generic.go:334] "Generic (PLEG): container finished" podID="9eb85b02-3bf8-4fe8-a060-c3593e995499" containerID="6664fb4d33777a22f3b23f221081d4c25fe5d5723674d4badccd9a64b3893b24" exitCode=0 Oct 03 14:25:38 crc kubenswrapper[4636]: I1003 14:25:38.650807 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" event={"ID":"9eb85b02-3bf8-4fe8-a060-c3593e995499","Type":"ContainerDied","Data":"6664fb4d33777a22f3b23f221081d4c25fe5d5723674d4badccd9a64b3893b24"} Oct 03 14:25:39 crc kubenswrapper[4636]: I1003 14:25:39.162591 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:25:39 crc kubenswrapper[4636]: I1003 14:25:39.162646 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:25:39 crc kubenswrapper[4636]: I1003 14:25:39.667611 4636 generic.go:334] "Generic (PLEG): container finished" podID="dd4d5068-5bad-4992-bd16-07055acb5f19" containerID="975a0654eb7ec4d9cb22bf34bd4789d0806b1681eba4e75c30ef6b8fb9aced14" exitCode=0 Oct 03 14:25:39 crc kubenswrapper[4636]: I1003 14:25:39.667679 
4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srl5v" event={"ID":"dd4d5068-5bad-4992-bd16-07055acb5f19","Type":"ContainerDied","Data":"975a0654eb7ec4d9cb22bf34bd4789d0806b1681eba4e75c30ef6b8fb9aced14"} Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.154901 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.340819 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cngvm\" (UniqueName: \"kubernetes.io/projected/9eb85b02-3bf8-4fe8-a060-c3593e995499-kube-api-access-cngvm\") pod \"9eb85b02-3bf8-4fe8-a060-c3593e995499\" (UID: \"9eb85b02-3bf8-4fe8-a060-c3593e995499\") " Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.340917 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9eb85b02-3bf8-4fe8-a060-c3593e995499-ssh-key\") pod \"9eb85b02-3bf8-4fe8-a060-c3593e995499\" (UID: \"9eb85b02-3bf8-4fe8-a060-c3593e995499\") " Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.340964 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9eb85b02-3bf8-4fe8-a060-c3593e995499-inventory\") pod \"9eb85b02-3bf8-4fe8-a060-c3593e995499\" (UID: \"9eb85b02-3bf8-4fe8-a060-c3593e995499\") " Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.341042 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb85b02-3bf8-4fe8-a060-c3593e995499-repo-setup-combined-ca-bundle\") pod \"9eb85b02-3bf8-4fe8-a060-c3593e995499\" (UID: \"9eb85b02-3bf8-4fe8-a060-c3593e995499\") " Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.346700 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eb85b02-3bf8-4fe8-a060-c3593e995499-kube-api-access-cngvm" (OuterVolumeSpecName: "kube-api-access-cngvm") pod "9eb85b02-3bf8-4fe8-a060-c3593e995499" (UID: "9eb85b02-3bf8-4fe8-a060-c3593e995499"). InnerVolumeSpecName "kube-api-access-cngvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.352813 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eb85b02-3bf8-4fe8-a060-c3593e995499-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "9eb85b02-3bf8-4fe8-a060-c3593e995499" (UID: "9eb85b02-3bf8-4fe8-a060-c3593e995499"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.370883 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eb85b02-3bf8-4fe8-a060-c3593e995499-inventory" (OuterVolumeSpecName: "inventory") pod "9eb85b02-3bf8-4fe8-a060-c3593e995499" (UID: "9eb85b02-3bf8-4fe8-a060-c3593e995499"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.372847 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eb85b02-3bf8-4fe8-a060-c3593e995499-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9eb85b02-3bf8-4fe8-a060-c3593e995499" (UID: "9eb85b02-3bf8-4fe8-a060-c3593e995499"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.443817 4636 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9eb85b02-3bf8-4fe8-a060-c3593e995499-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.443859 4636 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9eb85b02-3bf8-4fe8-a060-c3593e995499-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.443873 4636 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb85b02-3bf8-4fe8-a060-c3593e995499-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.443888 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cngvm\" (UniqueName: \"kubernetes.io/projected/9eb85b02-3bf8-4fe8-a060-c3593e995499-kube-api-access-cngvm\") on node \"crc\" DevicePath \"\"" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.676278 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" event={"ID":"9eb85b02-3bf8-4fe8-a060-c3593e995499","Type":"ContainerDied","Data":"1d6e616052ecbd74b84ffe37a688d5464c6d0c3c92f2cc52f02dc6904c9a5a0d"} Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.676315 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d6e616052ecbd74b84ffe37a688d5464c6d0c3c92f2cc52f02dc6904c9a5a0d" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.676313 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.678203 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srl5v" event={"ID":"dd4d5068-5bad-4992-bd16-07055acb5f19","Type":"ContainerStarted","Data":"db4cb1dd5d512da71e66ac756bf9b2578be3e18c8e39fbb0431332e69f20976c"} Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.707499 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-srl5v" podStartSLOduration=2.16234172 podStartE2EDuration="8.707484594s" podCreationTimestamp="2025-10-03 14:25:32 +0000 UTC" firstStartedPulling="2025-10-03 14:25:33.590169226 +0000 UTC m=+1483.448895473" lastFinishedPulling="2025-10-03 14:25:40.1353121 +0000 UTC m=+1489.994038347" observedRunningTime="2025-10-03 14:25:40.699381333 +0000 UTC m=+1490.558107580" watchObservedRunningTime="2025-10-03 14:25:40.707484594 +0000 UTC m=+1490.566210841" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.783704 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-4vk2h"] Oct 03 14:25:40 crc kubenswrapper[4636]: E1003 14:25:40.784332 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eb85b02-3bf8-4fe8-a060-c3593e995499" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.784350 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eb85b02-3bf8-4fe8-a060-c3593e995499" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.784577 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="9eb85b02-3bf8-4fe8-a060-c3593e995499" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.785446 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4vk2h" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.789538 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.789708 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.789871 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8b7c8" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.790447 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.829348 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-4vk2h"] Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.852924 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7ae7cb3-1588-4c70-92e2-942cef9d9b0a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4vk2h\" (UID: \"e7ae7cb3-1588-4c70-92e2-942cef9d9b0a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4vk2h" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.853013 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e7ae7cb3-1588-4c70-92e2-942cef9d9b0a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4vk2h\" (UID: \"e7ae7cb3-1588-4c70-92e2-942cef9d9b0a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4vk2h" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.853364 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9246\" (UniqueName: \"kubernetes.io/projected/e7ae7cb3-1588-4c70-92e2-942cef9d9b0a-kube-api-access-q9246\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4vk2h\" (UID: \"e7ae7cb3-1588-4c70-92e2-942cef9d9b0a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4vk2h" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.954279 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e7ae7cb3-1588-4c70-92e2-942cef9d9b0a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4vk2h\" (UID: \"e7ae7cb3-1588-4c70-92e2-942cef9d9b0a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4vk2h" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.954457 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9246\" (UniqueName: \"kubernetes.io/projected/e7ae7cb3-1588-4c70-92e2-942cef9d9b0a-kube-api-access-q9246\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4vk2h\" (UID: \"e7ae7cb3-1588-4c70-92e2-942cef9d9b0a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4vk2h" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.954550 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7ae7cb3-1588-4c70-92e2-942cef9d9b0a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4vk2h\" (UID: \"e7ae7cb3-1588-4c70-92e2-942cef9d9b0a\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4vk2h" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.960848 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e7ae7cb3-1588-4c70-92e2-942cef9d9b0a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4vk2h\" (UID: \"e7ae7cb3-1588-4c70-92e2-942cef9d9b0a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4vk2h" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.961143 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7ae7cb3-1588-4c70-92e2-942cef9d9b0a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4vk2h\" (UID: \"e7ae7cb3-1588-4c70-92e2-942cef9d9b0a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4vk2h" Oct 03 14:25:40 crc kubenswrapper[4636]: I1003 14:25:40.971930 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9246\" (UniqueName: \"kubernetes.io/projected/e7ae7cb3-1588-4c70-92e2-942cef9d9b0a-kube-api-access-q9246\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4vk2h\" (UID: \"e7ae7cb3-1588-4c70-92e2-942cef9d9b0a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4vk2h" Oct 03 14:25:41 crc kubenswrapper[4636]: I1003 14:25:41.107043 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4vk2h" Oct 03 14:25:41 crc kubenswrapper[4636]: I1003 14:25:41.637202 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-4vk2h"] Oct 03 14:25:41 crc kubenswrapper[4636]: I1003 14:25:41.691669 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4vk2h" event={"ID":"e7ae7cb3-1588-4c70-92e2-942cef9d9b0a","Type":"ContainerStarted","Data":"e3b17876bd6fab12be1c480e05cf30aedaa86e46954f9f43fd2ef6bf8184d628"} Oct 03 14:25:42 crc kubenswrapper[4636]: I1003 14:25:42.364438 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-srl5v" Oct 03 14:25:42 crc kubenswrapper[4636]: I1003 14:25:42.364738 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-srl5v" Oct 03 14:25:43 crc kubenswrapper[4636]: I1003 14:25:43.426334 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-srl5v" podUID="dd4d5068-5bad-4992-bd16-07055acb5f19" containerName="registry-server" probeResult="failure" output=< Oct 03 14:25:43 crc kubenswrapper[4636]: timeout: failed to connect service ":50051" within 1s Oct 03 14:25:43 crc kubenswrapper[4636]: > Oct 03 14:25:43 crc kubenswrapper[4636]: I1003 14:25:43.714240 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4vk2h" event={"ID":"e7ae7cb3-1588-4c70-92e2-942cef9d9b0a","Type":"ContainerStarted","Data":"6d92f3612afcb728cf0cb21f37d8ef6aadeaa557c99ba72b3f47a11e6096873e"} Oct 03 14:25:43 crc kubenswrapper[4636]: I1003 14:25:43.737864 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4vk2h" podStartSLOduration=2.546184722 podStartE2EDuration="3.737843556s" podCreationTimestamp="2025-10-03 14:25:40 +0000 UTC" firstStartedPulling="2025-10-03 14:25:41.651228685 +0000 UTC 
m=+1491.509954932" lastFinishedPulling="2025-10-03 14:25:42.842887519 +0000 UTC m=+1492.701613766" observedRunningTime="2025-10-03 14:25:43.728330329 +0000 UTC m=+1493.587056596" watchObservedRunningTime="2025-10-03 14:25:43.737843556 +0000 UTC m=+1493.596569803" Oct 03 14:25:48 crc kubenswrapper[4636]: I1003 14:25:48.763540 4636 generic.go:334] "Generic (PLEG): container finished" podID="e7ae7cb3-1588-4c70-92e2-942cef9d9b0a" containerID="6d92f3612afcb728cf0cb21f37d8ef6aadeaa557c99ba72b3f47a11e6096873e" exitCode=0 Oct 03 14:25:48 crc kubenswrapper[4636]: I1003 14:25:48.763636 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4vk2h" event={"ID":"e7ae7cb3-1588-4c70-92e2-942cef9d9b0a","Type":"ContainerDied","Data":"6d92f3612afcb728cf0cb21f37d8ef6aadeaa557c99ba72b3f47a11e6096873e"} Oct 03 14:25:50 crc kubenswrapper[4636]: I1003 14:25:50.178509 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4vk2h" Oct 03 14:25:50 crc kubenswrapper[4636]: I1003 14:25:50.344467 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9246\" (UniqueName: \"kubernetes.io/projected/e7ae7cb3-1588-4c70-92e2-942cef9d9b0a-kube-api-access-q9246\") pod \"e7ae7cb3-1588-4c70-92e2-942cef9d9b0a\" (UID: \"e7ae7cb3-1588-4c70-92e2-942cef9d9b0a\") " Oct 03 14:25:50 crc kubenswrapper[4636]: I1003 14:25:50.344703 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7ae7cb3-1588-4c70-92e2-942cef9d9b0a-inventory\") pod \"e7ae7cb3-1588-4c70-92e2-942cef9d9b0a\" (UID: \"e7ae7cb3-1588-4c70-92e2-942cef9d9b0a\") " Oct 03 14:25:50 crc kubenswrapper[4636]: I1003 14:25:50.344752 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e7ae7cb3-1588-4c70-92e2-942cef9d9b0a-ssh-key\") pod \"e7ae7cb3-1588-4c70-92e2-942cef9d9b0a\" (UID: \"e7ae7cb3-1588-4c70-92e2-942cef9d9b0a\") " Oct 03 14:25:50 crc kubenswrapper[4636]: I1003 14:25:50.351247 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7ae7cb3-1588-4c70-92e2-942cef9d9b0a-kube-api-access-q9246" (OuterVolumeSpecName: "kube-api-access-q9246") pod "e7ae7cb3-1588-4c70-92e2-942cef9d9b0a" (UID: "e7ae7cb3-1588-4c70-92e2-942cef9d9b0a"). InnerVolumeSpecName "kube-api-access-q9246". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:25:50 crc kubenswrapper[4636]: I1003 14:25:50.374898 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7ae7cb3-1588-4c70-92e2-942cef9d9b0a-inventory" (OuterVolumeSpecName: "inventory") pod "e7ae7cb3-1588-4c70-92e2-942cef9d9b0a" (UID: "e7ae7cb3-1588-4c70-92e2-942cef9d9b0a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:25:50 crc kubenswrapper[4636]: I1003 14:25:50.377903 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7ae7cb3-1588-4c70-92e2-942cef9d9b0a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e7ae7cb3-1588-4c70-92e2-942cef9d9b0a" (UID: "e7ae7cb3-1588-4c70-92e2-942cef9d9b0a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:25:50 crc kubenswrapper[4636]: I1003 14:25:50.448570 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9246\" (UniqueName: \"kubernetes.io/projected/e7ae7cb3-1588-4c70-92e2-942cef9d9b0a-kube-api-access-q9246\") on node \"crc\" DevicePath \"\"" Oct 03 14:25:50 crc kubenswrapper[4636]: I1003 14:25:50.448626 4636 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7ae7cb3-1588-4c70-92e2-942cef9d9b0a-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 14:25:50 crc kubenswrapper[4636]: I1003 14:25:50.448640 4636 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e7ae7cb3-1588-4c70-92e2-942cef9d9b0a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:25:50 crc kubenswrapper[4636]: I1003 14:25:50.781675 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4vk2h" event={"ID":"e7ae7cb3-1588-4c70-92e2-942cef9d9b0a","Type":"ContainerDied","Data":"e3b17876bd6fab12be1c480e05cf30aedaa86e46954f9f43fd2ef6bf8184d628"} Oct 03 14:25:50 crc kubenswrapper[4636]: I1003 14:25:50.781713 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3b17876bd6fab12be1c480e05cf30aedaa86e46954f9f43fd2ef6bf8184d628" Oct 03 14:25:50 crc kubenswrapper[4636]: I1003 14:25:50.781729 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4vk2h" Oct 03 14:25:50 crc kubenswrapper[4636]: I1003 14:25:50.876565 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr"] Oct 03 14:25:50 crc kubenswrapper[4636]: E1003 14:25:50.877468 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7ae7cb3-1588-4c70-92e2-942cef9d9b0a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 03 14:25:50 crc kubenswrapper[4636]: I1003 14:25:50.877488 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7ae7cb3-1588-4c70-92e2-942cef9d9b0a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 03 14:25:50 crc kubenswrapper[4636]: I1003 14:25:50.877754 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7ae7cb3-1588-4c70-92e2-942cef9d9b0a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 03 14:25:50 crc kubenswrapper[4636]: I1003 14:25:50.881868 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr" Oct 03 14:25:50 crc kubenswrapper[4636]: I1003 14:25:50.886299 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 14:25:50 crc kubenswrapper[4636]: I1003 14:25:50.886487 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:25:50 crc kubenswrapper[4636]: I1003 14:25:50.886541 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 14:25:50 crc kubenswrapper[4636]: I1003 14:25:50.886636 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8b7c8" Oct 03 14:25:50 crc kubenswrapper[4636]: I1003 14:25:50.888401 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr"] Oct 03 14:25:51 crc kubenswrapper[4636]: I1003 14:25:51.069462 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d50548-733b-4696-9e0f-fc749406a055-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr\" (UID: \"57d50548-733b-4696-9e0f-fc749406a055\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr" Oct 03 14:25:51 crc kubenswrapper[4636]: I1003 14:25:51.070194 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57d50548-733b-4696-9e0f-fc749406a055-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr\" (UID: \"57d50548-733b-4696-9e0f-fc749406a055\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr" Oct 03 14:25:51 crc kubenswrapper[4636]: I1003 14:25:51.070355 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wq52\" (UniqueName: \"kubernetes.io/projected/57d50548-733b-4696-9e0f-fc749406a055-kube-api-access-2wq52\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr\" (UID: \"57d50548-733b-4696-9e0f-fc749406a055\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr" Oct 03 14:25:51 crc kubenswrapper[4636]: I1003 14:25:51.070504 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57d50548-733b-4696-9e0f-fc749406a055-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr\" (UID: \"57d50548-733b-4696-9e0f-fc749406a055\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr" Oct 03 14:25:51 crc kubenswrapper[4636]: I1003 14:25:51.172164 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57d50548-733b-4696-9e0f-fc749406a055-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr\" (UID: \"57d50548-733b-4696-9e0f-fc749406a055\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr" Oct 03 14:25:51 crc kubenswrapper[4636]: I1003 14:25:51.172271 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d50548-733b-4696-9e0f-fc749406a055-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr\" (UID: \"57d50548-733b-4696-9e0f-fc749406a055\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr" Oct 03 14:25:51 crc kubenswrapper[4636]: I1003 14:25:51.172307 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57d50548-733b-4696-9e0f-fc749406a055-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr\" (UID: \"57d50548-733b-4696-9e0f-fc749406a055\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr" Oct 03 14:25:51 crc kubenswrapper[4636]: I1003 14:25:51.172383 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wq52\" (UniqueName: \"kubernetes.io/projected/57d50548-733b-4696-9e0f-fc749406a055-kube-api-access-2wq52\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr\" (UID: \"57d50548-733b-4696-9e0f-fc749406a055\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr" Oct 03 14:25:51 crc kubenswrapper[4636]: I1003 14:25:51.177229 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d50548-733b-4696-9e0f-fc749406a055-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr\" (UID: \"57d50548-733b-4696-9e0f-fc749406a055\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr" Oct 03 14:25:51 crc kubenswrapper[4636]: I1003 14:25:51.185378 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57d50548-733b-4696-9e0f-fc749406a055-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr\" (UID: \"57d50548-733b-4696-9e0f-fc749406a055\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr" Oct 03 14:25:51 crc kubenswrapper[4636]: I1003 14:25:51.185750 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57d50548-733b-4696-9e0f-fc749406a055-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr\" (UID: \"57d50548-733b-4696-9e0f-fc749406a055\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr" Oct 03 14:25:51 crc kubenswrapper[4636]: I1003 14:25:51.195289 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wq52\" (UniqueName: \"kubernetes.io/projected/57d50548-733b-4696-9e0f-fc749406a055-kube-api-access-2wq52\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr\" (UID: \"57d50548-733b-4696-9e0f-fc749406a055\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr" Oct 03 14:25:51 crc kubenswrapper[4636]: I1003 14:25:51.218765 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr" Oct 03 14:25:51 crc kubenswrapper[4636]: I1003 14:25:51.816430 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr"] Oct 03 14:25:52 crc kubenswrapper[4636]: I1003 14:25:52.417582 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-srl5v" Oct 03 14:25:52 crc kubenswrapper[4636]: I1003 14:25:52.470129 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-srl5v" Oct 03 14:25:52 crc kubenswrapper[4636]: I1003 14:25:52.660984 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-srl5v"] Oct 03 14:25:52 crc kubenswrapper[4636]: I1003 14:25:52.814613 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr" event={"ID":"57d50548-733b-4696-9e0f-fc749406a055","Type":"ContainerStarted","Data":"776b63a58b6b8ea22c6bb53ffc999424ece33e62d264490ef5f2a0e179f9b174"} Oct 03 14:25:52 crc kubenswrapper[4636]: I1003 14:25:52.814660 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr" event={"ID":"57d50548-733b-4696-9e0f-fc749406a055","Type":"ContainerStarted","Data":"bbc3faf289d929e0c8bc8f6b66101445c2b708b4b0ff6c10ce3f5d82d7bc339f"} Oct 03 14:25:52 crc kubenswrapper[4636]: I1003 14:25:52.829218 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr" podStartSLOduration=2.398628141 podStartE2EDuration="2.82919909s" podCreationTimestamp="2025-10-03 14:25:50 +0000 UTC" firstStartedPulling="2025-10-03 14:25:51.817723901 +0000 UTC m=+1501.676450148" lastFinishedPulling="2025-10-03 14:25:52.24829484 +0000 UTC m=+1502.107021097" observedRunningTime="2025-10-03 14:25:52.829034285 +0000 UTC m=+1502.687760532" watchObservedRunningTime="2025-10-03 14:25:52.82919909 +0000 UTC m=+1502.687925337" Oct 03 14:25:53 crc kubenswrapper[4636]: I1003 14:25:53.816720 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-srl5v" podUID="dd4d5068-5bad-4992-bd16-07055acb5f19" containerName="registry-server" containerID="cri-o://db4cb1dd5d512da71e66ac756bf9b2578be3e18c8e39fbb0431332e69f20976c" gracePeriod=2 Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.277345 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-srl5v" Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.449720 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkfmh\" (UniqueName: \"kubernetes.io/projected/dd4d5068-5bad-4992-bd16-07055acb5f19-kube-api-access-hkfmh\") pod \"dd4d5068-5bad-4992-bd16-07055acb5f19\" (UID: \"dd4d5068-5bad-4992-bd16-07055acb5f19\") " Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.450004 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd4d5068-5bad-4992-bd16-07055acb5f19-catalog-content\") pod \"dd4d5068-5bad-4992-bd16-07055acb5f19\" (UID: \"dd4d5068-5bad-4992-bd16-07055acb5f19\") " Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.450291 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd4d5068-5bad-4992-bd16-07055acb5f19-utilities\") pod \"dd4d5068-5bad-4992-bd16-07055acb5f19\" (UID: \"dd4d5068-5bad-4992-bd16-07055acb5f19\") " Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.451499 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd4d5068-5bad-4992-bd16-07055acb5f19-utilities" (OuterVolumeSpecName: "utilities") pod "dd4d5068-5bad-4992-bd16-07055acb5f19" (UID: "dd4d5068-5bad-4992-bd16-07055acb5f19"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.457850 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd4d5068-5bad-4992-bd16-07055acb5f19-kube-api-access-hkfmh" (OuterVolumeSpecName: "kube-api-access-hkfmh") pod "dd4d5068-5bad-4992-bd16-07055acb5f19" (UID: "dd4d5068-5bad-4992-bd16-07055acb5f19"). InnerVolumeSpecName "kube-api-access-hkfmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.498545 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd4d5068-5bad-4992-bd16-07055acb5f19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd4d5068-5bad-4992-bd16-07055acb5f19" (UID: "dd4d5068-5bad-4992-bd16-07055acb5f19"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.553816 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd4d5068-5bad-4992-bd16-07055acb5f19-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.554043 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd4d5068-5bad-4992-bd16-07055acb5f19-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.554158 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkfmh\" (UniqueName: \"kubernetes.io/projected/dd4d5068-5bad-4992-bd16-07055acb5f19-kube-api-access-hkfmh\") on node \"crc\" DevicePath \"\"" Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.827649 4636 generic.go:334] "Generic (PLEG): container finished" podID="dd4d5068-5bad-4992-bd16-07055acb5f19" containerID="db4cb1dd5d512da71e66ac756bf9b2578be3e18c8e39fbb0431332e69f20976c" exitCode=0 Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.827899 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srl5v" event={"ID":"dd4d5068-5bad-4992-bd16-07055acb5f19","Type":"ContainerDied","Data":"db4cb1dd5d512da71e66ac756bf9b2578be3e18c8e39fbb0431332e69f20976c"} Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.827924 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srl5v" event={"ID":"dd4d5068-5bad-4992-bd16-07055acb5f19","Type":"ContainerDied","Data":"97a10d1dee09a4e5389d20d1ef15563988020cf5838f2d59e2ad3fbb0f053ad3"} Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.827940 4636 scope.go:117] "RemoveContainer" containerID="db4cb1dd5d512da71e66ac756bf9b2578be3e18c8e39fbb0431332e69f20976c" Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.828058 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-srl5v" Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.857055 4636 scope.go:117] "RemoveContainer" containerID="975a0654eb7ec4d9cb22bf34bd4789d0806b1681eba4e75c30ef6b8fb9aced14" Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.857317 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-srl5v"] Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.865140 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-srl5v"] Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.896971 4636 scope.go:117] "RemoveContainer" containerID="ed32930e9d81b2c339d5f8f6e23c5fb7fbc5226a360e29c2e7b49cab2934827f" Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.927734 4636 scope.go:117] "RemoveContainer" containerID="db4cb1dd5d512da71e66ac756bf9b2578be3e18c8e39fbb0431332e69f20976c" Oct 03 14:25:54 crc kubenswrapper[4636]: E1003 14:25:54.929249 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db4cb1dd5d512da71e66ac756bf9b2578be3e18c8e39fbb0431332e69f20976c\": container with ID starting with db4cb1dd5d512da71e66ac756bf9b2578be3e18c8e39fbb0431332e69f20976c not found: ID does not exist" containerID="db4cb1dd5d512da71e66ac756bf9b2578be3e18c8e39fbb0431332e69f20976c" Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.929294 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db4cb1dd5d512da71e66ac756bf9b2578be3e18c8e39fbb0431332e69f20976c"} err="failed to get container status \"db4cb1dd5d512da71e66ac756bf9b2578be3e18c8e39fbb0431332e69f20976c\": rpc error: code = NotFound desc = could not find container \"db4cb1dd5d512da71e66ac756bf9b2578be3e18c8e39fbb0431332e69f20976c\": container with ID starting with db4cb1dd5d512da71e66ac756bf9b2578be3e18c8e39fbb0431332e69f20976c not found: ID does not exist" Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.929323 4636 scope.go:117] "RemoveContainer" containerID="975a0654eb7ec4d9cb22bf34bd4789d0806b1681eba4e75c30ef6b8fb9aced14" Oct 03 14:25:54 crc kubenswrapper[4636]: E1003 14:25:54.930302 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"975a0654eb7ec4d9cb22bf34bd4789d0806b1681eba4e75c30ef6b8fb9aced14\": container with ID starting with 975a0654eb7ec4d9cb22bf34bd4789d0806b1681eba4e75c30ef6b8fb9aced14 not found: ID does not exist" containerID="975a0654eb7ec4d9cb22bf34bd4789d0806b1681eba4e75c30ef6b8fb9aced14" Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.930331 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"975a0654eb7ec4d9cb22bf34bd4789d0806b1681eba4e75c30ef6b8fb9aced14"} err="failed to get container status \"975a0654eb7ec4d9cb22bf34bd4789d0806b1681eba4e75c30ef6b8fb9aced14\": rpc error: code = NotFound desc = could not find container \"975a0654eb7ec4d9cb22bf34bd4789d0806b1681eba4e75c30ef6b8fb9aced14\": container with ID starting with 975a0654eb7ec4d9cb22bf34bd4789d0806b1681eba4e75c30ef6b8fb9aced14 not found: ID does not exist" Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.930352 4636 scope.go:117] "RemoveContainer" containerID="ed32930e9d81b2c339d5f8f6e23c5fb7fbc5226a360e29c2e7b49cab2934827f" Oct 03 14:25:54 crc kubenswrapper[4636]: E1003 14:25:54.930949 4636 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ed32930e9d81b2c339d5f8f6e23c5fb7fbc5226a360e29c2e7b49cab2934827f\": container with ID starting with ed32930e9d81b2c339d5f8f6e23c5fb7fbc5226a360e29c2e7b49cab2934827f not found: ID does not exist" containerID="ed32930e9d81b2c339d5f8f6e23c5fb7fbc5226a360e29c2e7b49cab2934827f" Oct 03 14:25:54 crc kubenswrapper[4636]: I1003 14:25:54.930976 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed32930e9d81b2c339d5f8f6e23c5fb7fbc5226a360e29c2e7b49cab2934827f"} err="failed to get container status \"ed32930e9d81b2c339d5f8f6e23c5fb7fbc5226a360e29c2e7b49cab2934827f\": rpc error: code = NotFound desc = could not find container \"ed32930e9d81b2c339d5f8f6e23c5fb7fbc5226a360e29c2e7b49cab2934827f\": container with ID starting with ed32930e9d81b2c339d5f8f6e23c5fb7fbc5226a360e29c2e7b49cab2934827f not found: ID does not exist" Oct 03 14:25:56 crc kubenswrapper[4636]: I1003 14:25:56.805384 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd4d5068-5bad-4992-bd16-07055acb5f19" path="/var/lib/kubelet/pods/dd4d5068-5bad-4992-bd16-07055acb5f19/volumes" Oct 03 14:26:09 crc kubenswrapper[4636]: I1003 14:26:09.163209 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:26:09 crc kubenswrapper[4636]: I1003 14:26:09.163841 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:26:39 crc kubenswrapper[4636]: I1003 14:26:39.163019 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:26:39 crc kubenswrapper[4636]: I1003 14:26:39.163727 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:26:39 crc kubenswrapper[4636]: I1003 14:26:39.163788 4636 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 14:26:39 crc kubenswrapper[4636]: I1003 14:26:39.165090 4636 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186"} pod="openshift-machine-config-operator/machine-config-daemon-ngmch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 14:26:39 crc kubenswrapper[4636]: I1003 14:26:39.165224 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" 
podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" containerID="cri-o://4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186" gracePeriod=600 Oct 03 14:26:39 crc kubenswrapper[4636]: E1003 14:26:39.286731 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:26:40 crc kubenswrapper[4636]: I1003 14:26:40.275343 4636 generic.go:334] "Generic (PLEG): container finished" podID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186" exitCode=0 Oct 03 14:26:40 crc kubenswrapper[4636]: I1003 14:26:40.275415 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerDied","Data":"4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186"} Oct 03 14:26:40 crc kubenswrapper[4636]: I1003 14:26:40.275782 4636 scope.go:117] "RemoveContainer" containerID="3f35c195de607af5e2083a70ee704e67efe4c37e24910c615f6adb0ee1029e41" Oct 03 14:26:40 crc kubenswrapper[4636]: I1003 14:26:40.276392 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186" Oct 03 14:26:40 crc kubenswrapper[4636]: E1003 14:26:40.276693 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:26:51 crc kubenswrapper[4636]: I1003 14:26:51.793756 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186" Oct 03 14:26:51 crc kubenswrapper[4636]: E1003 14:26:51.795533 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:27:06 crc kubenswrapper[4636]: I1003 14:27:06.799022 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186" Oct 03 14:27:06 crc kubenswrapper[4636]: E1003 14:27:06.801033 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:27:19 crc kubenswrapper[4636]: I1003 14:27:19.794944 4636 scope.go:117] 
"RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186" Oct 03 14:27:19 crc kubenswrapper[4636]: E1003 14:27:19.795721 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:27:30 crc kubenswrapper[4636]: I1003 14:27:30.801469 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186" Oct 03 14:27:30 crc kubenswrapper[4636]: E1003 14:27:30.802280 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:27:42 crc kubenswrapper[4636]: I1003 14:27:42.793672 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186" Oct 03 14:27:42 crc kubenswrapper[4636]: E1003 14:27:42.794476 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:27:53 crc kubenswrapper[4636]: I1003 14:27:53.793692 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186" Oct 03 14:27:53 crc kubenswrapper[4636]: E1003 14:27:53.794413 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:28:00 crc kubenswrapper[4636]: I1003 14:28:00.043963 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-4pz7d"] Oct 03 14:28:00 crc kubenswrapper[4636]: I1003 14:28:00.055987 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-4pz7d"] Oct 03 14:28:00 crc kubenswrapper[4636]: I1003 14:28:00.805808 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="316899b4-4f1f-4065-ae6b-fddfa3c90ab6" path="/var/lib/kubelet/pods/316899b4-4f1f-4065-ae6b-fddfa3c90ab6/volumes" Oct 03 14:28:04 crc kubenswrapper[4636]: I1003 14:28:04.803840 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186" Oct 03 14:28:04 crc kubenswrapper[4636]: E1003 14:28:04.805972 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:28:06 crc kubenswrapper[4636]: I1003 14:28:06.034090 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-8zhsb"] Oct 03 14:28:06 crc kubenswrapper[4636]: I1003 14:28:06.041536 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-8zhsb"] Oct 03 14:28:06 crc kubenswrapper[4636]: I1003 14:28:06.805609 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b04e574-4d75-478a-a55f-486aab465fa7" path="/var/lib/kubelet/pods/4b04e574-4d75-478a-a55f-486aab465fa7/volumes" Oct 03 14:28:07 crc kubenswrapper[4636]: I1003 14:28:07.966925 4636 scope.go:117] "RemoveContainer" containerID="4fc67d8ab8602e125956c304c66ab466454a22bbcb767f4de347fe516cd3e915" Oct 03 14:28:07 crc kubenswrapper[4636]: I1003 14:28:07.997610 4636 scope.go:117] "RemoveContainer" containerID="7427676545f7fbcc148fe269987c5a6a638bc3eff511865c0ddc8424c989e573" Oct 03 14:28:08 crc kubenswrapper[4636]: I1003 14:28:08.049238 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-pbmsg"] Oct 03 14:28:08 crc kubenswrapper[4636]: I1003 14:28:08.059663 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-zh4x6"] Oct 03 14:28:08 crc kubenswrapper[4636]: I1003 14:28:08.070338 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-jjmrv"] Oct 03 14:28:08 crc kubenswrapper[4636]: I1003 14:28:08.079827 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-pbmsg"] Oct 03 14:28:08 crc kubenswrapper[4636]: I1003 14:28:08.089626 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-zh4x6"] Oct 03 14:28:08 crc kubenswrapper[4636]: I1003 14:28:08.098811 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-jjmrv"] Oct 03 14:28:08 crc kubenswrapper[4636]: I1003 14:28:08.807108 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15d1351d-7b6e-4ced-b207-5ec41477a9a6" path="/var/lib/kubelet/pods/15d1351d-7b6e-4ced-b207-5ec41477a9a6/volumes" Oct 03 14:28:08 crc kubenswrapper[4636]: I1003 14:28:08.808549 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39407704-a90e-4ea4-a39b-1ec109994c04" path="/var/lib/kubelet/pods/39407704-a90e-4ea4-a39b-1ec109994c04/volumes" Oct 03 14:28:08 crc kubenswrapper[4636]: I1003 14:28:08.810088 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cba05788-5cbc-43bf-90a3-16dd333267d6" path="/var/lib/kubelet/pods/cba05788-5cbc-43bf-90a3-16dd333267d6/volumes" Oct 03 14:28:11 crc kubenswrapper[4636]: I1003 14:28:11.038043 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-9bncb"] Oct 03 14:28:11 crc kubenswrapper[4636]: I1003 14:28:11.049537 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8d56-account-create-m645b"] Oct 03 14:28:11 crc kubenswrapper[4636]: I1003 14:28:11.060080 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8d56-account-create-m645b"] Oct 03 14:28:11 crc kubenswrapper[4636]: I1003 14:28:11.068772 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-db-create-9bncb"] Oct 03 14:28:12 crc kubenswrapper[4636]: I1003 14:28:12.805533 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dfd65d1-e6ef-4646-a546-6a03d0443231" path="/var/lib/kubelet/pods/8dfd65d1-e6ef-4646-a546-6a03d0443231/volumes" Oct 03 14:28:12 crc kubenswrapper[4636]: I1003 14:28:12.806201 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f968a95-a0b1-4f56-886b-64674656f645" path="/var/lib/kubelet/pods/8f968a95-a0b1-4f56-886b-64674656f645/volumes" Oct 03 14:28:16 crc kubenswrapper[4636]: I1003 14:28:16.028498 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-6e1e-account-create-5f99d"] Oct 03 14:28:16 crc kubenswrapper[4636]: I1003 14:28:16.038091 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-6e1e-account-create-5f99d"] Oct 03 14:28:16 crc kubenswrapper[4636]: I1003 14:28:16.804551 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="659ed666-3a55-49bb-a35e-a59098f195d0" path="/var/lib/kubelet/pods/659ed666-3a55-49bb-a35e-a59098f195d0/volumes" Oct 03 14:28:17 crc kubenswrapper[4636]: I1003 14:28:17.794808 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186" Oct 03 14:28:17 crc kubenswrapper[4636]: E1003 14:28:17.795046 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:28:18 crc kubenswrapper[4636]: I1003 14:28:18.026826 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-448f-account-create-prkdj"] Oct 03 14:28:18 crc kubenswrapper[4636]: I1003 14:28:18.034770 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c50b-account-create-qn2bs"] Oct 03 14:28:18 crc kubenswrapper[4636]: I1003 14:28:18.045141 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-dc03-account-create-rwrxn"] Oct 03 14:28:18 crc kubenswrapper[4636]: I1003 14:28:18.053505 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-448f-account-create-prkdj"] Oct 03 14:28:18 crc kubenswrapper[4636]: I1003 14:28:18.060818 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c50b-account-create-qn2bs"] Oct 03 14:28:18 crc kubenswrapper[4636]: I1003 14:28:18.067579 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-dc03-account-create-rwrxn"] Oct 03 14:28:18 crc kubenswrapper[4636]: I1003 14:28:18.859123 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dd02d0a-c1bc-4e2d-a682-7b2db952669a" path="/var/lib/kubelet/pods/4dd02d0a-c1bc-4e2d-a682-7b2db952669a/volumes" Oct 03 14:28:18 crc kubenswrapper[4636]: I1003 14:28:18.860296 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fe7893f-d888-4ac1-8179-4aa5322618f1" path="/var/lib/kubelet/pods/6fe7893f-d888-4ac1-8179-4aa5322618f1/volumes" Oct 03 14:28:18 crc kubenswrapper[4636]: I1003 14:28:18.860820 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="888c174c-e532-4731-a87a-7490a32c8e8b" 
path="/var/lib/kubelet/pods/888c174c-e532-4731-a87a-7490a32c8e8b/volumes" Oct 03 14:28:21 crc kubenswrapper[4636]: I1003 14:28:21.024907 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4584-account-create-sbqsq"] Oct 03 14:28:21 crc kubenswrapper[4636]: I1003 14:28:21.036456 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4584-account-create-sbqsq"] Oct 03 14:28:22 crc kubenswrapper[4636]: I1003 14:28:22.805147 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1da9d9e1-4437-471a-9a5b-d507a11f1695" path="/var/lib/kubelet/pods/1da9d9e1-4437-471a-9a5b-d507a11f1695/volumes" Oct 03 14:28:30 crc kubenswrapper[4636]: I1003 14:28:30.801782 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186" Oct 03 14:28:30 crc kubenswrapper[4636]: E1003 14:28:30.804823 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:28:41 crc kubenswrapper[4636]: I1003 14:28:41.794195 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186" Oct 03 14:28:41 crc kubenswrapper[4636]: E1003 14:28:41.794979 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:28:47 crc kubenswrapper[4636]: I1003 14:28:47.040183 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-8sj76"] Oct 03 14:28:47 crc kubenswrapper[4636]: I1003 14:28:47.050003 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-8sj76"] Oct 03 14:28:48 crc kubenswrapper[4636]: I1003 14:28:48.808553 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc578a05-9113-4226-bf5e-a8e907722e8e" path="/var/lib/kubelet/pods/bc578a05-9113-4226-bf5e-a8e907722e8e/volumes" Oct 03 14:28:52 crc kubenswrapper[4636]: I1003 14:28:52.794373 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186" Oct 03 14:28:52 crc kubenswrapper[4636]: E1003 14:28:52.795231 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:28:56 crc kubenswrapper[4636]: I1003 14:28:56.060188 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-mg222"] Oct 03 14:28:56 crc kubenswrapper[4636]: I1003 14:28:56.071133 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-db-sync-mg222"] Oct 03 14:28:56 crc kubenswrapper[4636]: I1003 14:28:56.807386 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71c19545-0af9-461d-bf0b-ba0a08f8dbff" path="/var/lib/kubelet/pods/71c19545-0af9-461d-bf0b-ba0a08f8dbff/volumes" Oct 03 14:29:05 crc kubenswrapper[4636]: I1003 14:29:05.793579 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186" Oct 03 14:29:05 crc kubenswrapper[4636]: E1003 14:29:05.794274 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:29:08 crc kubenswrapper[4636]: I1003 14:29:08.133803 4636 scope.go:117] "RemoveContainer" containerID="4e8ccede73b08a34d31fa25ce8f102a724dd4669e4e79415b0f717f35ab25cf2" Oct 03 14:29:08 crc kubenswrapper[4636]: I1003 14:29:08.156404 4636 scope.go:117] "RemoveContainer" containerID="59b63ea0120facc0242d39598a9b996e020bfe2e115a1beb366c46b10289c32b" Oct 03 14:29:08 crc kubenswrapper[4636]: I1003 14:29:08.203190 4636 scope.go:117] "RemoveContainer" containerID="1445179e9326a4af4ecda546a09280bb29c9007e099827700aa55b99a765e574" Oct 03 14:29:08 crc kubenswrapper[4636]: I1003 14:29:08.252808 4636 scope.go:117] "RemoveContainer" containerID="9c3f56e587df9ba67da7828ee218e2c1da5b0a865be8ce95542c9b46c452883e" Oct 03 14:29:08 crc kubenswrapper[4636]: I1003 14:29:08.304524 4636 scope.go:117] "RemoveContainer" containerID="b6b0bd47b8175ddb56dec171e0ed222e997d0e28fdae220abee7b845e3fe99e6" Oct 03 14:29:08 crc kubenswrapper[4636]: I1003 14:29:08.340370 4636 scope.go:117] "RemoveContainer" containerID="3f7d3e73e29a1e05ce1a3b37f2e7bd7a7ca98b3dfd17338119dc600b7b27ae37" Oct 03 14:29:08 crc kubenswrapper[4636]: I1003 14:29:08.381884 4636 scope.go:117] "RemoveContainer" containerID="095eee19630799318fdc6318ba4bb43e071d907e171920f99aa03e47055d3d9f" Oct 03 14:29:08 crc kubenswrapper[4636]: I1003 14:29:08.402309 4636 scope.go:117] "RemoveContainer" containerID="f68e0284fb2d29216433a2429b1d2698059756deb224a76feb6646a9c745b834" Oct 03 14:29:08 crc kubenswrapper[4636]: I1003 14:29:08.422179 4636 scope.go:117] "RemoveContainer" containerID="1f0f864b1513ec5d23a369230ba27c7d5d37fec35beb05061cc0c47c8b0ddb89" Oct 03 14:29:08 crc kubenswrapper[4636]: I1003 14:29:08.450206 4636 scope.go:117] "RemoveContainer" containerID="bd8c444f0feca9e0b1e45ed016bfbdb580cb90fd571e9006cef20983f9ec3930" Oct 03 14:29:08 crc kubenswrapper[4636]: I1003 14:29:08.473652 4636 scope.go:117] "RemoveContainer" containerID="ffafdd9752b894ef9001f94845b20b8ae023a574abdb4da1b45b85fbc40d384d" Oct 03 14:29:08 crc kubenswrapper[4636]: I1003 14:29:08.492378 4636 scope.go:117] "RemoveContainer" containerID="27a71dd2504c82195b4fdb68f9a984a42f8a81f8be983e256d3c34ec08fccce1" Oct 03 14:29:08 crc kubenswrapper[4636]: I1003 14:29:08.511621 4636 scope.go:117] "RemoveContainer" containerID="4ee2dc0da05e851a61daa34a3a7591c2d7340b64d96a80d2c7d38250f1eac576" Oct 03 14:29:08 crc kubenswrapper[4636]: I1003 14:29:08.532527 4636 scope.go:117] "RemoveContainer" containerID="474118c6ef44d0a9437fd488c1de35bb8a64721c2e4f794114bd97ead892e0bb" Oct 03 14:29:18 crc kubenswrapper[4636]: I1003 14:29:18.794038 4636 scope.go:117] 
"RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186" Oct 03 14:29:18 crc kubenswrapper[4636]: E1003 14:29:18.794788 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:29:23 crc kubenswrapper[4636]: I1003 14:29:23.732306 4636 generic.go:334] "Generic (PLEG): container finished" podID="57d50548-733b-4696-9e0f-fc749406a055" containerID="776b63a58b6b8ea22c6bb53ffc999424ece33e62d264490ef5f2a0e179f9b174" exitCode=0 Oct 03 14:29:23 crc kubenswrapper[4636]: I1003 14:29:23.732472 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr" event={"ID":"57d50548-733b-4696-9e0f-fc749406a055","Type":"ContainerDied","Data":"776b63a58b6b8ea22c6bb53ffc999424ece33e62d264490ef5f2a0e179f9b174"} Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.174131 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.363462 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57d50548-733b-4696-9e0f-fc749406a055-inventory\") pod \"57d50548-733b-4696-9e0f-fc749406a055\" (UID: \"57d50548-733b-4696-9e0f-fc749406a055\") " Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.363513 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d50548-733b-4696-9e0f-fc749406a055-bootstrap-combined-ca-bundle\") pod \"57d50548-733b-4696-9e0f-fc749406a055\" (UID: \"57d50548-733b-4696-9e0f-fc749406a055\") " Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.363587 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57d50548-733b-4696-9e0f-fc749406a055-ssh-key\") pod \"57d50548-733b-4696-9e0f-fc749406a055\" (UID: \"57d50548-733b-4696-9e0f-fc749406a055\") " Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.363629 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wq52\" (UniqueName: \"kubernetes.io/projected/57d50548-733b-4696-9e0f-fc749406a055-kube-api-access-2wq52\") pod \"57d50548-733b-4696-9e0f-fc749406a055\" (UID: \"57d50548-733b-4696-9e0f-fc749406a055\") " Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.369612 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d50548-733b-4696-9e0f-fc749406a055-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "57d50548-733b-4696-9e0f-fc749406a055" (UID: "57d50548-733b-4696-9e0f-fc749406a055"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.374368 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57d50548-733b-4696-9e0f-fc749406a055-kube-api-access-2wq52" (OuterVolumeSpecName: "kube-api-access-2wq52") pod "57d50548-733b-4696-9e0f-fc749406a055" (UID: "57d50548-733b-4696-9e0f-fc749406a055"). InnerVolumeSpecName "kube-api-access-2wq52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.396404 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d50548-733b-4696-9e0f-fc749406a055-inventory" (OuterVolumeSpecName: "inventory") pod "57d50548-733b-4696-9e0f-fc749406a055" (UID: "57d50548-733b-4696-9e0f-fc749406a055"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.398052 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d50548-733b-4696-9e0f-fc749406a055-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "57d50548-733b-4696-9e0f-fc749406a055" (UID: "57d50548-733b-4696-9e0f-fc749406a055"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.466283 4636 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57d50548-733b-4696-9e0f-fc749406a055-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.466547 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wq52\" (UniqueName: \"kubernetes.io/projected/57d50548-733b-4696-9e0f-fc749406a055-kube-api-access-2wq52\") on node \"crc\" DevicePath \"\"" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.466619 4636 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57d50548-733b-4696-9e0f-fc749406a055-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.466683 4636 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d50548-733b-4696-9e0f-fc749406a055-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.750417 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr" event={"ID":"57d50548-733b-4696-9e0f-fc749406a055","Type":"ContainerDied","Data":"bbc3faf289d929e0c8bc8f6b66101445c2b708b4b0ff6c10ce3f5d82d7bc339f"} Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.750719 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbc3faf289d929e0c8bc8f6b66101445c2b708b4b0ff6c10ce3f5d82d7bc339f" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.750485 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.880223 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f99bb"] Oct 03 14:29:25 crc kubenswrapper[4636]: E1003 14:29:25.880817 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d50548-733b-4696-9e0f-fc749406a055" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.880841 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d50548-733b-4696-9e0f-fc749406a055" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 03 14:29:25 crc kubenswrapper[4636]: E1003 14:29:25.880858 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd4d5068-5bad-4992-bd16-07055acb5f19" containerName="registry-server" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.880867 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd4d5068-5bad-4992-bd16-07055acb5f19" containerName="registry-server" Oct 03 14:29:25 crc kubenswrapper[4636]: E1003 14:29:25.880881 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd4d5068-5bad-4992-bd16-07055acb5f19" containerName="extract-content" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.880890 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd4d5068-5bad-4992-bd16-07055acb5f19" containerName="extract-content" Oct 03 14:29:25 crc kubenswrapper[4636]: E1003 14:29:25.880910 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd4d5068-5bad-4992-bd16-07055acb5f19" containerName="extract-utilities" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.880918 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd4d5068-5bad-4992-bd16-07055acb5f19" containerName="extract-utilities" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.882839 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd4d5068-5bad-4992-bd16-07055acb5f19" containerName="registry-server" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.882886 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d50548-733b-4696-9e0f-fc749406a055" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.883699 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f99bb" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.890250 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.895167 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.895461 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.895948 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8b7c8" Oct 03 14:29:25 crc kubenswrapper[4636]: I1003 14:29:25.966544 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f99bb"] Oct 03 14:29:26 crc kubenswrapper[4636]: I1003 14:29:26.079206 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbkj9\" (UniqueName: \"kubernetes.io/projected/a1c24630-7d57-45b9-8bdd-fb45d6a74c61-kube-api-access-sbkj9\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f99bb\" (UID: \"a1c24630-7d57-45b9-8bdd-fb45d6a74c61\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f99bb" Oct 03 14:29:26 crc kubenswrapper[4636]: I1003 14:29:26.079672 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1c24630-7d57-45b9-8bdd-fb45d6a74c61-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f99bb\" (UID: \"a1c24630-7d57-45b9-8bdd-fb45d6a74c61\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f99bb" Oct 03 14:29:26 crc kubenswrapper[4636]: I1003 14:29:26.079786 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1c24630-7d57-45b9-8bdd-fb45d6a74c61-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f99bb\" (UID: \"a1c24630-7d57-45b9-8bdd-fb45d6a74c61\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f99bb" Oct 03 14:29:26 crc kubenswrapper[4636]: I1003 14:29:26.181963 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbkj9\" (UniqueName: \"kubernetes.io/projected/a1c24630-7d57-45b9-8bdd-fb45d6a74c61-kube-api-access-sbkj9\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f99bb\" (UID: \"a1c24630-7d57-45b9-8bdd-fb45d6a74c61\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f99bb" Oct 03 14:29:26 crc kubenswrapper[4636]: I1003 14:29:26.182829 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1c24630-7d57-45b9-8bdd-fb45d6a74c61-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f99bb\" (UID: \"a1c24630-7d57-45b9-8bdd-fb45d6a74c61\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f99bb" Oct 03 14:29:26 crc kubenswrapper[4636]: I1003 14:29:26.182897 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1c24630-7d57-45b9-8bdd-fb45d6a74c61-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-f99bb\" (UID: \"a1c24630-7d57-45b9-8bdd-fb45d6a74c61\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f99bb" Oct 03 14:29:26 crc kubenswrapper[4636]: I1003 14:29:26.190886 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1c24630-7d57-45b9-8bdd-fb45d6a74c61-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f99bb\" (UID: \"a1c24630-7d57-45b9-8bdd-fb45d6a74c61\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f99bb" Oct 03 14:29:26 crc kubenswrapper[4636]: I1003 14:29:26.191921 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1c24630-7d57-45b9-8bdd-fb45d6a74c61-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f99bb\" (UID: \"a1c24630-7d57-45b9-8bdd-fb45d6a74c61\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f99bb" Oct 03 14:29:26 crc kubenswrapper[4636]: I1003 14:29:26.215843 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbkj9\" (UniqueName: \"kubernetes.io/projected/a1c24630-7d57-45b9-8bdd-fb45d6a74c61-kube-api-access-sbkj9\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f99bb\" (UID: \"a1c24630-7d57-45b9-8bdd-fb45d6a74c61\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f99bb" Oct 03 14:29:26 crc kubenswrapper[4636]: I1003 14:29:26.511944 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f99bb" Oct 03 14:29:27 crc kubenswrapper[4636]: I1003 14:29:27.088616 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f99bb"] Oct 03 14:29:27 crc kubenswrapper[4636]: I1003 14:29:27.101794 4636 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 14:29:27 crc kubenswrapper[4636]: I1003 14:29:27.772510 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f99bb" event={"ID":"a1c24630-7d57-45b9-8bdd-fb45d6a74c61","Type":"ContainerStarted","Data":"e985e2a74eb6432ca6308d5dcd28b9eed7725b8a3d6f4283953d1bf0c1021bf8"} Oct 03 14:29:28 crc kubenswrapper[4636]: I1003 14:29:28.784114 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f99bb" event={"ID":"a1c24630-7d57-45b9-8bdd-fb45d6a74c61","Type":"ContainerStarted","Data":"0046e811a2f73e8467cfde0801b9a43601ec71a46cb7764c97dd3c2ebe0470e5"} Oct 03 14:29:28 crc kubenswrapper[4636]: I1003 14:29:28.819890 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f99bb" podStartSLOduration=2.7997060940000003 podStartE2EDuration="3.819870266s" podCreationTimestamp="2025-10-03 14:29:25 +0000 UTC" firstStartedPulling="2025-10-03 14:29:27.101477962 +0000 UTC m=+1716.960204209" lastFinishedPulling="2025-10-03 14:29:28.121642134 +0000 UTC m=+1717.980368381" observedRunningTime="2025-10-03 14:29:28.801510949 +0000 UTC m=+1718.660237206" watchObservedRunningTime="2025-10-03 14:29:28.819870266 +0000 UTC m=+1718.678596513" Oct 03 14:29:29 crc kubenswrapper[4636]: I1003 14:29:29.793446 4636 scope.go:117] "RemoveContainer" 
containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186" Oct 03 14:29:29 crc kubenswrapper[4636]: E1003 14:29:29.793962 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:29:43 crc kubenswrapper[4636]: I1003 14:29:43.793710 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186" Oct 03 14:29:43 crc kubenswrapper[4636]: E1003 14:29:43.794756 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:29:58 crc kubenswrapper[4636]: I1003 14:29:58.793170 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186" Oct 03 14:29:58 crc kubenswrapper[4636]: E1003 14:29:58.793848 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:29:59 crc kubenswrapper[4636]: I1003 14:29:59.042642 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-254t7"] Oct 03 14:29:59 crc kubenswrapper[4636]: I1003 14:29:59.053885 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-254t7"] Oct 03 14:30:00 crc kubenswrapper[4636]: I1003 14:30:00.149530 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325030-sk2kj"] Oct 03 14:30:00 crc kubenswrapper[4636]: I1003 14:30:00.151031 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-sk2kj" Oct 03 14:30:00 crc kubenswrapper[4636]: I1003 14:30:00.153342 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 14:30:00 crc kubenswrapper[4636]: I1003 14:30:00.153620 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 14:30:00 crc kubenswrapper[4636]: I1003 14:30:00.165620 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325030-sk2kj"] Oct 03 14:30:00 crc kubenswrapper[4636]: I1003 14:30:00.319007 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e6d7770-9370-4750-b396-038328ae41ef-config-volume\") pod \"collect-profiles-29325030-sk2kj\" (UID: \"2e6d7770-9370-4750-b396-038328ae41ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-sk2kj" Oct 03 14:30:00 crc kubenswrapper[4636]: I1003 14:30:00.319140 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c76cl\" (UniqueName: \"kubernetes.io/projected/2e6d7770-9370-4750-b396-038328ae41ef-kube-api-access-c76cl\") pod \"collect-profiles-29325030-sk2kj\" (UID: \"2e6d7770-9370-4750-b396-038328ae41ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-sk2kj" Oct 03 14:30:00 crc kubenswrapper[4636]: I1003 14:30:00.319189 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e6d7770-9370-4750-b396-038328ae41ef-secret-volume\") pod \"collect-profiles-29325030-sk2kj\" (UID: \"2e6d7770-9370-4750-b396-038328ae41ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-sk2kj" Oct 03 14:30:00 crc kubenswrapper[4636]: I1003 14:30:00.420925 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c76cl\" (UniqueName: \"kubernetes.io/projected/2e6d7770-9370-4750-b396-038328ae41ef-kube-api-access-c76cl\") pod \"collect-profiles-29325030-sk2kj\" (UID: \"2e6d7770-9370-4750-b396-038328ae41ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-sk2kj" Oct 03 14:30:00 crc kubenswrapper[4636]: I1003 14:30:00.421003 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e6d7770-9370-4750-b396-038328ae41ef-secret-volume\") pod \"collect-profiles-29325030-sk2kj\" (UID: \"2e6d7770-9370-4750-b396-038328ae41ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-sk2kj" Oct 03 14:30:00 crc kubenswrapper[4636]: I1003 14:30:00.421110 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e6d7770-9370-4750-b396-038328ae41ef-config-volume\") pod \"collect-profiles-29325030-sk2kj\" (UID: \"2e6d7770-9370-4750-b396-038328ae41ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-sk2kj" Oct 03 14:30:00 crc kubenswrapper[4636]: I1003 14:30:00.421940 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e6d7770-9370-4750-b396-038328ae41ef-config-volume\") pod 
\"collect-profiles-29325030-sk2kj\" (UID: \"2e6d7770-9370-4750-b396-038328ae41ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-sk2kj" Oct 03 14:30:00 crc kubenswrapper[4636]: I1003 14:30:00.434837 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e6d7770-9370-4750-b396-038328ae41ef-secret-volume\") pod \"collect-profiles-29325030-sk2kj\" (UID: \"2e6d7770-9370-4750-b396-038328ae41ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-sk2kj" Oct 03 14:30:00 crc kubenswrapper[4636]: I1003 14:30:00.437862 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c76cl\" (UniqueName: \"kubernetes.io/projected/2e6d7770-9370-4750-b396-038328ae41ef-kube-api-access-c76cl\") pod \"collect-profiles-29325030-sk2kj\" (UID: \"2e6d7770-9370-4750-b396-038328ae41ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-sk2kj" Oct 03 14:30:00 crc kubenswrapper[4636]: I1003 14:30:00.470790 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-sk2kj" Oct 03 14:30:00 crc kubenswrapper[4636]: I1003 14:30:00.807034 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e686877-1f2c-4049-8f72-2788c4ff74b8" path="/var/lib/kubelet/pods/9e686877-1f2c-4049-8f72-2788c4ff74b8/volumes" Oct 03 14:30:00 crc kubenswrapper[4636]: I1003 14:30:00.964186 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325030-sk2kj"] Oct 03 14:30:01 crc kubenswrapper[4636]: I1003 14:30:01.055775 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-sk2kj" event={"ID":"2e6d7770-9370-4750-b396-038328ae41ef","Type":"ContainerStarted","Data":"3f83435ed4a73e9efd0ae4ebef54d46c6c569b3fa915f2794f032c05ce997c18"} Oct 03 14:30:02 crc kubenswrapper[4636]: I1003 14:30:02.064654 4636 generic.go:334] "Generic (PLEG): container finished" podID="2e6d7770-9370-4750-b396-038328ae41ef" containerID="bae38e860cb253b42df6dcc76be10af40d8122d58598cd3c13d22cf4590659ad" exitCode=0 Oct 03 14:30:02 crc kubenswrapper[4636]: I1003 14:30:02.064865 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-sk2kj" event={"ID":"2e6d7770-9370-4750-b396-038328ae41ef","Type":"ContainerDied","Data":"bae38e860cb253b42df6dcc76be10af40d8122d58598cd3c13d22cf4590659ad"} Oct 03 14:30:03 crc kubenswrapper[4636]: I1003 14:30:03.416349 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-sk2kj" Oct 03 14:30:03 crc kubenswrapper[4636]: I1003 14:30:03.583903 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e6d7770-9370-4750-b396-038328ae41ef-config-volume\") pod \"2e6d7770-9370-4750-b396-038328ae41ef\" (UID: \"2e6d7770-9370-4750-b396-038328ae41ef\") " Oct 03 14:30:03 crc kubenswrapper[4636]: I1003 14:30:03.583956 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e6d7770-9370-4750-b396-038328ae41ef-secret-volume\") pod \"2e6d7770-9370-4750-b396-038328ae41ef\" (UID: \"2e6d7770-9370-4750-b396-038328ae41ef\") " Oct 03 14:30:03 crc kubenswrapper[4636]: I1003 14:30:03.584057 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c76cl\" (UniqueName: \"kubernetes.io/projected/2e6d7770-9370-4750-b396-038328ae41ef-kube-api-access-c76cl\") pod \"2e6d7770-9370-4750-b396-038328ae41ef\" (UID: \"2e6d7770-9370-4750-b396-038328ae41ef\") " Oct 03 14:30:03 crc kubenswrapper[4636]: I1003 14:30:03.584736 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e6d7770-9370-4750-b396-038328ae41ef-config-volume" (OuterVolumeSpecName: "config-volume") pod "2e6d7770-9370-4750-b396-038328ae41ef" (UID: "2e6d7770-9370-4750-b396-038328ae41ef"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:30:03 crc kubenswrapper[4636]: I1003 14:30:03.585096 4636 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e6d7770-9370-4750-b396-038328ae41ef-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 14:30:03 crc kubenswrapper[4636]: I1003 14:30:03.590331 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6d7770-9370-4750-b396-038328ae41ef-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2e6d7770-9370-4750-b396-038328ae41ef" (UID: "2e6d7770-9370-4750-b396-038328ae41ef"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:30:03 crc kubenswrapper[4636]: I1003 14:30:03.590475 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e6d7770-9370-4750-b396-038328ae41ef-kube-api-access-c76cl" (OuterVolumeSpecName: "kube-api-access-c76cl") pod "2e6d7770-9370-4750-b396-038328ae41ef" (UID: "2e6d7770-9370-4750-b396-038328ae41ef"). InnerVolumeSpecName "kube-api-access-c76cl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:30:03 crc kubenswrapper[4636]: I1003 14:30:03.686740 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c76cl\" (UniqueName: \"kubernetes.io/projected/2e6d7770-9370-4750-b396-038328ae41ef-kube-api-access-c76cl\") on node \"crc\" DevicePath \"\"" Oct 03 14:30:03 crc kubenswrapper[4636]: I1003 14:30:03.687044 4636 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e6d7770-9370-4750-b396-038328ae41ef-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 14:30:04 crc kubenswrapper[4636]: I1003 14:30:04.082595 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-sk2kj" event={"ID":"2e6d7770-9370-4750-b396-038328ae41ef","Type":"ContainerDied","Data":"3f83435ed4a73e9efd0ae4ebef54d46c6c569b3fa915f2794f032c05ce997c18"} Oct 03 14:30:04 crc kubenswrapper[4636]: I1003 14:30:04.082629 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f83435ed4a73e9efd0ae4ebef54d46c6c569b3fa915f2794f032c05ce997c18" Oct 03 14:30:04 crc kubenswrapper[4636]: I1003 14:30:04.082634 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325030-sk2kj" Oct 03 14:30:08 crc kubenswrapper[4636]: I1003 14:30:08.023945 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-hv2wz"] Oct 03 14:30:08 crc kubenswrapper[4636]: I1003 14:30:08.031971 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-hv2wz"] Oct 03 14:30:08 crc kubenswrapper[4636]: I1003 14:30:08.804932 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02e452cc-6659-4abd-88ff-d9e731b9b1ef" path="/var/lib/kubelet/pods/02e452cc-6659-4abd-88ff-d9e731b9b1ef/volumes" Oct 03 14:30:08 crc kubenswrapper[4636]: I1003 14:30:08.813238 4636 scope.go:117] "RemoveContainer" containerID="e186ebaf19345cada0ff2fcd09f9e68a719a10444ee9ec1f7097c7a5988eb3eb" Oct 03 14:30:08 crc kubenswrapper[4636]: I1003 14:30:08.840825 4636 scope.go:117] "RemoveContainer" containerID="a895fd68ed7ee228433cf4e51a3f47a322af9549d5459155ccc3bc9373dd2cd5" Oct 03 14:30:10 crc kubenswrapper[4636]: I1003 14:30:10.044223 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-5dc9p"] Oct 03 14:30:10 crc kubenswrapper[4636]: I1003 14:30:10.052517 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-5dc9p"] Oct 03 14:30:10 crc kubenswrapper[4636]: I1003 14:30:10.804867 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eb04b62-9d0b-4dda-aff1-022bed4af5b4" path="/var/lib/kubelet/pods/0eb04b62-9d0b-4dda-aff1-022bed4af5b4/volumes" Oct 03 14:30:11 crc kubenswrapper[4636]: I1003 14:30:11.794550 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186" Oct 03 14:30:11 crc kubenswrapper[4636]: E1003 14:30:11.794927 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 
Oct 03 14:30:12 crc kubenswrapper[4636]: I1003 14:30:12.037038 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-97znq"]
Oct 03 14:30:12 crc kubenswrapper[4636]: I1003 14:30:12.044857 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-97znq"]
Oct 03 14:30:12 crc kubenswrapper[4636]: I1003 14:30:12.805599 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d2a38ef-2fad-4a66-a131-2f690ceb72f1" path="/var/lib/kubelet/pods/7d2a38ef-2fad-4a66-a131-2f690ceb72f1/volumes"
Oct 03 14:30:23 crc kubenswrapper[4636]: I1003 14:30:23.033801 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-zgg4h"]
Oct 03 14:30:23 crc kubenswrapper[4636]: I1003 14:30:23.045409 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-zgg4h"]
Oct 03 14:30:24 crc kubenswrapper[4636]: I1003 14:30:24.805527 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49" path="/var/lib/kubelet/pods/48ac3e13-8b1e-43c8-95cd-d6c2eb1fab49/volumes"
Oct 03 14:30:26 crc kubenswrapper[4636]: I1003 14:30:26.794335 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186"
Oct 03 14:30:26 crc kubenswrapper[4636]: E1003 14:30:26.794943 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:30:41 crc kubenswrapper[4636]: I1003 14:30:41.793628 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186"
Oct 03 14:30:41 crc kubenswrapper[4636]: E1003 14:30:41.794400 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:30:56 crc kubenswrapper[4636]: I1003 14:30:56.793585 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186"
Oct 03 14:30:56 crc kubenswrapper[4636]: E1003 14:30:56.794486 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:31:08 crc kubenswrapper[4636]: I1003 14:31:08.957196 4636 scope.go:117] "RemoveContainer" containerID="f0629691999792afdb898c8aa1583f7b512e37775489d5965de17947ce8238f7"
Oct 03 14:31:08 crc kubenswrapper[4636]: I1003 14:31:08.984593 4636 scope.go:117] "RemoveContainer" containerID="18a48e0b0583ccc1d96ce521f6e96af8c5680842f0f276483d011fab120d4498"
Oct 03 14:31:09 crc kubenswrapper[4636]: I1003 14:31:09.032042 4636 scope.go:117] "RemoveContainer" containerID="ea54d616c2574efe0fca015fd7163ec725f13d0b9b15a84b7a1a06c0339a7c22"
Oct 03 14:31:11 crc kubenswrapper[4636]: I1003 14:31:11.795600 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186"
Oct 03 14:31:11 crc kubenswrapper[4636]: E1003 14:31:11.796998 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:31:22 crc kubenswrapper[4636]: I1003 14:31:22.794267 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186"
Oct 03 14:31:22 crc kubenswrapper[4636]: E1003 14:31:22.794979 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:31:29 crc kubenswrapper[4636]: I1003 14:31:29.039606 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-fj8lz"]
Oct 03 14:31:29 crc kubenswrapper[4636]: I1003 14:31:29.050087 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-rbdk4"]
Oct 03 14:31:29 crc kubenswrapper[4636]: I1003 14:31:29.061087 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-sfqrs"]
Oct 03 14:31:29 crc kubenswrapper[4636]: I1003 14:31:29.072251 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-sfqrs"]
Oct 03 14:31:29 crc kubenswrapper[4636]: I1003 14:31:29.083682 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-fj8lz"]
Oct 03 14:31:29 crc kubenswrapper[4636]: I1003 14:31:29.090583 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-rbdk4"]
Oct 03 14:31:30 crc kubenswrapper[4636]: I1003 14:31:30.807996 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79e8d634-a8e9-43d5-ac01-e640cc209af3" path="/var/lib/kubelet/pods/79e8d634-a8e9-43d5-ac01-e640cc209af3/volumes"
Oct 03 14:31:30 crc kubenswrapper[4636]: I1003 14:31:30.810572 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f746edf-871e-487f-96f0-d640ee2e9266" path="/var/lib/kubelet/pods/7f746edf-871e-487f-96f0-d640ee2e9266/volumes"
Oct 03 14:31:30 crc kubenswrapper[4636]: I1003 14:31:30.811649 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f762c138-feda-4a6d-8d07-dfcbb5efaf4d" path="/var/lib/kubelet/pods/f762c138-feda-4a6d-8d07-dfcbb5efaf4d/volumes"
Oct 03 14:31:34 crc kubenswrapper[4636]: I1003 14:31:34.793679 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186"
Oct 03 14:31:34 crc kubenswrapper[4636]: E1003 14:31:34.794440 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:31:39 crc kubenswrapper[4636]: I1003 14:31:39.029612 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cc4c-account-create-2g6t4"]
Oct 03 14:31:39 crc kubenswrapper[4636]: I1003 14:31:39.038327 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cc4c-account-create-2g6t4"]
Oct 03 14:31:39 crc kubenswrapper[4636]: I1003 14:31:39.052906 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-c311-account-create-s5q7f"]
Oct 03 14:31:39 crc kubenswrapper[4636]: I1003 14:31:39.059836 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-3116-account-create-hc7k2"]
Oct 03 14:31:39 crc kubenswrapper[4636]: I1003 14:31:39.066573 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-c311-account-create-s5q7f"]
Oct 03 14:31:39 crc kubenswrapper[4636]: I1003 14:31:39.072723 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-3116-account-create-hc7k2"]
Oct 03 14:31:40 crc kubenswrapper[4636]: I1003 14:31:40.810059 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dcc76fa-3c4f-4196-b6b8-3add9559c134" path="/var/lib/kubelet/pods/1dcc76fa-3c4f-4196-b6b8-3add9559c134/volumes"
Oct 03 14:31:40 crc kubenswrapper[4636]: I1003 14:31:40.810830 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ddc23ff-72a1-4527-864e-0cd4fd4b3cb4" path="/var/lib/kubelet/pods/1ddc23ff-72a1-4527-864e-0cd4fd4b3cb4/volumes"
Oct 03 14:31:40 crc kubenswrapper[4636]: I1003 14:31:40.812219 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="549b8746-9d8a-4f99-82b7-e03650acb897" path="/var/lib/kubelet/pods/549b8746-9d8a-4f99-82b7-e03650acb897/volumes"
Oct 03 14:31:47 crc kubenswrapper[4636]: I1003 14:31:47.794308 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186"
Oct 03 14:31:48 crc kubenswrapper[4636]: I1003 14:31:48.029561 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerStarted","Data":"6bf347f17c6a57808711c5d59e3068eaceb6db558930f7501a3d5cd85b4d3b8e"}
Oct 03 14:32:05 crc kubenswrapper[4636]: I1003 14:32:05.049753 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rw5t7"]
Oct 03 14:32:05 crc kubenswrapper[4636]: I1003 14:32:05.057568 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rw5t7"]
Oct 03 14:32:06 crc kubenswrapper[4636]: I1003 14:32:06.807048 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb2e76a1-5457-484c-b311-c46b1eecec12" path="/var/lib/kubelet/pods/fb2e76a1-5457-484c-b311-c46b1eecec12/volumes"
Oct 03 14:32:09 crc kubenswrapper[4636]: I1003 14:32:09.136709 4636 scope.go:117] "RemoveContainer" containerID="c5be79b6e2407ac3e8070c2de882f889bd58565c08bea0179ce362e5016be510"
Oct 03 14:32:09 crc kubenswrapper[4636]: I1003 14:32:09.184027 4636 scope.go:117] "RemoveContainer" containerID="a7bfdb1098cf857476b93f5a191c1c86c969d42dedaa18fe09b18e1ed16a0035"
Oct 03 14:32:09 crc kubenswrapper[4636]: I1003 14:32:09.213023 4636 scope.go:117] "RemoveContainer" containerID="06420344c54af020ec13a167ce7aa2de36ddaaf0ab59509bc6346fa8e6105a3b"
Oct 03 14:32:09 crc kubenswrapper[4636]: I1003 14:32:09.252561 4636 scope.go:117] "RemoveContainer" containerID="c69218f29980ade8c4fee05ca8263c09b390f7eb796d9667a66e6a558c52d546"
Oct 03 14:32:09 crc kubenswrapper[4636]: I1003 14:32:09.295349 4636 scope.go:117] "RemoveContainer" containerID="f5b92446f305154875fc571b2ce39d3bb9cd0c0963f22bc9fed2395e1c63753d"
Oct 03 14:32:09 crc kubenswrapper[4636]: I1003 14:32:09.342378 4636 scope.go:117] "RemoveContainer" containerID="b3d48089cf8ab9fa3275569fa4cf6f9bffbac37c67b2e708d5b5cf276d0eb1f5"
Oct 03 14:32:09 crc kubenswrapper[4636]: I1003 14:32:09.381456 4636 scope.go:117] "RemoveContainer" containerID="62e23a8427ddb727a81b774a9ad6a01057905ac1eda53c868bbf61793a4ec9b2"
containerID="a7bfdb1098cf857476b93f5a191c1c86c969d42dedaa18fe09b18e1ed16a0035" Oct 03 14:32:09 crc kubenswrapper[4636]: I1003 14:32:09.213023 4636 scope.go:117] "RemoveContainer" containerID="06420344c54af020ec13a167ce7aa2de36ddaaf0ab59509bc6346fa8e6105a3b" Oct 03 14:32:09 crc kubenswrapper[4636]: I1003 14:32:09.252561 4636 scope.go:117] "RemoveContainer" containerID="c69218f29980ade8c4fee05ca8263c09b390f7eb796d9667a66e6a558c52d546" Oct 03 14:32:09 crc kubenswrapper[4636]: I1003 14:32:09.295349 4636 scope.go:117] "RemoveContainer" containerID="f5b92446f305154875fc571b2ce39d3bb9cd0c0963f22bc9fed2395e1c63753d" Oct 03 14:32:09 crc kubenswrapper[4636]: I1003 14:32:09.342378 4636 scope.go:117] "RemoveContainer" containerID="b3d48089cf8ab9fa3275569fa4cf6f9bffbac37c67b2e708d5b5cf276d0eb1f5" Oct 03 14:32:09 crc kubenswrapper[4636]: I1003 14:32:09.381456 4636 scope.go:117] "RemoveContainer" containerID="62e23a8427ddb727a81b774a9ad6a01057905ac1eda53c868bbf61793a4ec9b2" Oct 03 14:32:29 crc kubenswrapper[4636]: I1003 14:32:29.037536 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-9l45j"] Oct 03 14:32:29 crc kubenswrapper[4636]: I1003 14:32:29.044903 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-9l45j"] Oct 03 14:32:30 crc kubenswrapper[4636]: I1003 14:32:30.805636 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f5bfa77-9c30-4f65-900b-62595074b467" path="/var/lib/kubelet/pods/7f5bfa77-9c30-4f65-900b-62595074b467/volumes" Oct 03 14:32:34 crc kubenswrapper[4636]: I1003 14:32:34.029325 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9b2q2"] Oct 03 14:32:34 crc kubenswrapper[4636]: I1003 14:32:34.036964 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9b2q2"] Oct 03 14:32:34 crc kubenswrapper[4636]: I1003 14:32:34.810558 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5" path="/var/lib/kubelet/pods/60cd2a84-fbf1-43f7-ad9e-0bb5f83f66d5/volumes" Oct 03 14:32:41 crc kubenswrapper[4636]: I1003 14:32:41.480241 4636 generic.go:334] "Generic (PLEG): container finished" podID="a1c24630-7d57-45b9-8bdd-fb45d6a74c61" containerID="0046e811a2f73e8467cfde0801b9a43601ec71a46cb7764c97dd3c2ebe0470e5" exitCode=0 Oct 03 14:32:41 crc kubenswrapper[4636]: I1003 14:32:41.480425 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f99bb" event={"ID":"a1c24630-7d57-45b9-8bdd-fb45d6a74c61","Type":"ContainerDied","Data":"0046e811a2f73e8467cfde0801b9a43601ec71a46cb7764c97dd3c2ebe0470e5"} Oct 03 14:32:42 crc kubenswrapper[4636]: I1003 14:32:42.918497 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f99bb" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.030347 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1c24630-7d57-45b9-8bdd-fb45d6a74c61-ssh-key\") pod \"a1c24630-7d57-45b9-8bdd-fb45d6a74c61\" (UID: \"a1c24630-7d57-45b9-8bdd-fb45d6a74c61\") " Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.030399 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbkj9\" (UniqueName: \"kubernetes.io/projected/a1c24630-7d57-45b9-8bdd-fb45d6a74c61-kube-api-access-sbkj9\") pod \"a1c24630-7d57-45b9-8bdd-fb45d6a74c61\" (UID: \"a1c24630-7d57-45b9-8bdd-fb45d6a74c61\") " Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.030479 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1c24630-7d57-45b9-8bdd-fb45d6a74c61-inventory\") pod \"a1c24630-7d57-45b9-8bdd-fb45d6a74c61\" (UID: \"a1c24630-7d57-45b9-8bdd-fb45d6a74c61\") " Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.041325 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1c24630-7d57-45b9-8bdd-fb45d6a74c61-kube-api-access-sbkj9" (OuterVolumeSpecName: "kube-api-access-sbkj9") pod "a1c24630-7d57-45b9-8bdd-fb45d6a74c61" (UID: "a1c24630-7d57-45b9-8bdd-fb45d6a74c61"). InnerVolumeSpecName "kube-api-access-sbkj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.062195 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1c24630-7d57-45b9-8bdd-fb45d6a74c61-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a1c24630-7d57-45b9-8bdd-fb45d6a74c61" (UID: "a1c24630-7d57-45b9-8bdd-fb45d6a74c61"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.064385 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1c24630-7d57-45b9-8bdd-fb45d6a74c61-inventory" (OuterVolumeSpecName: "inventory") pod "a1c24630-7d57-45b9-8bdd-fb45d6a74c61" (UID: "a1c24630-7d57-45b9-8bdd-fb45d6a74c61"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.132785 4636 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1c24630-7d57-45b9-8bdd-fb45d6a74c61-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.132814 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbkj9\" (UniqueName: \"kubernetes.io/projected/a1c24630-7d57-45b9-8bdd-fb45d6a74c61-kube-api-access-sbkj9\") on node \"crc\" DevicePath \"\"" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.132824 4636 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1c24630-7d57-45b9-8bdd-fb45d6a74c61-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.495658 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f99bb" event={"ID":"a1c24630-7d57-45b9-8bdd-fb45d6a74c61","Type":"ContainerDied","Data":"e985e2a74eb6432ca6308d5dcd28b9eed7725b8a3d6f4283953d1bf0c1021bf8"} Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.495696 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e985e2a74eb6432ca6308d5dcd28b9eed7725b8a3d6f4283953d1bf0c1021bf8" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.495745 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f99bb" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.583342 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qplrd"] Oct 03 14:32:43 crc kubenswrapper[4636]: E1003 14:32:43.584215 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6d7770-9370-4750-b396-038328ae41ef" containerName="collect-profiles" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.584350 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6d7770-9370-4750-b396-038328ae41ef" containerName="collect-profiles" Oct 03 14:32:43 crc kubenswrapper[4636]: E1003 14:32:43.584531 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c24630-7d57-45b9-8bdd-fb45d6a74c61" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.584632 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c24630-7d57-45b9-8bdd-fb45d6a74c61" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.584965 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e6d7770-9370-4750-b396-038328ae41ef" containerName="collect-profiles" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.585132 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1c24630-7d57-45b9-8bdd-fb45d6a74c61" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.586045 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qplrd" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.590995 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.591083 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8b7c8" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.591178 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.591284 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.599716 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qplrd"] Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.741327 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9781ac24-d39e-4e00-b2e8-3eac5f120090-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qplrd\" (UID: \"9781ac24-d39e-4e00-b2e8-3eac5f120090\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qplrd" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.741386 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pllsj\" (UniqueName: \"kubernetes.io/projected/9781ac24-d39e-4e00-b2e8-3eac5f120090-kube-api-access-pllsj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qplrd\" (UID: \"9781ac24-d39e-4e00-b2e8-3eac5f120090\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qplrd" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.741463 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9781ac24-d39e-4e00-b2e8-3eac5f120090-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qplrd\" (UID: \"9781ac24-d39e-4e00-b2e8-3eac5f120090\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qplrd" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.842634 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9781ac24-d39e-4e00-b2e8-3eac5f120090-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qplrd\" (UID: \"9781ac24-d39e-4e00-b2e8-3eac5f120090\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qplrd" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.842789 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9781ac24-d39e-4e00-b2e8-3eac5f120090-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qplrd\" (UID: \"9781ac24-d39e-4e00-b2e8-3eac5f120090\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qplrd" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.842834 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pllsj\" (UniqueName: \"kubernetes.io/projected/9781ac24-d39e-4e00-b2e8-3eac5f120090-kube-api-access-pllsj\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qplrd\" (UID: \"9781ac24-d39e-4e00-b2e8-3eac5f120090\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qplrd" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.851865 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9781ac24-d39e-4e00-b2e8-3eac5f120090-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qplrd\" (UID: \"9781ac24-d39e-4e00-b2e8-3eac5f120090\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qplrd" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.851874 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9781ac24-d39e-4e00-b2e8-3eac5f120090-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qplrd\" (UID: \"9781ac24-d39e-4e00-b2e8-3eac5f120090\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qplrd" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.861063 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pllsj\" (UniqueName: \"kubernetes.io/projected/9781ac24-d39e-4e00-b2e8-3eac5f120090-kube-api-access-pllsj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qplrd\" (UID: \"9781ac24-d39e-4e00-b2e8-3eac5f120090\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qplrd" Oct 03 14:32:43 crc kubenswrapper[4636]: I1003 14:32:43.910782 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qplrd" Oct 03 14:32:44 crc kubenswrapper[4636]: I1003 14:32:44.414240 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qplrd"] Oct 03 14:32:44 crc kubenswrapper[4636]: I1003 14:32:44.506874 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qplrd" event={"ID":"9781ac24-d39e-4e00-b2e8-3eac5f120090","Type":"ContainerStarted","Data":"d439cfa9f3147d25d9ed356868846601f2741681aea873db1ff2b0b035da1a37"} Oct 03 14:32:45 crc kubenswrapper[4636]: I1003 14:32:45.517737 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qplrd" event={"ID":"9781ac24-d39e-4e00-b2e8-3eac5f120090","Type":"ContainerStarted","Data":"9a22058ec64cbacc691f35ae1d79c76304a88f882385027690e82bfcfbb7da97"} Oct 03 14:32:45 crc kubenswrapper[4636]: I1003 14:32:45.535441 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qplrd" podStartSLOduration=2.126668877 podStartE2EDuration="2.535423257s" podCreationTimestamp="2025-10-03 14:32:43 +0000 UTC" firstStartedPulling="2025-10-03 14:32:44.4333339 +0000 UTC m=+1914.292060147" lastFinishedPulling="2025-10-03 14:32:44.84208828 +0000 UTC m=+1914.700814527" observedRunningTime="2025-10-03 14:32:45.532987764 +0000 UTC m=+1915.391714011" watchObservedRunningTime="2025-10-03 14:32:45.535423257 +0000 UTC m=+1915.394149504" Oct 03 14:33:09 crc kubenswrapper[4636]: I1003 14:33:09.515585 4636 scope.go:117] "RemoveContainer" containerID="da82b036efdbc7e73e12a5ebefa44760b78b445f536f231f90c1a47b60593634" Oct 03 14:33:09 crc kubenswrapper[4636]: I1003 14:33:09.550708 4636 scope.go:117] "RemoveContainer" 
containerID="cf31480d5894ef92aab08f7c038b7f96f19f5cdbe5835559a8bf45bff7ecda9d" Oct 03 14:33:16 crc kubenswrapper[4636]: I1003 14:33:16.038668 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-f4zgp"] Oct 03 14:33:16 crc kubenswrapper[4636]: I1003 14:33:16.046805 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-f4zgp"] Oct 03 14:33:16 crc kubenswrapper[4636]: I1003 14:33:16.806132 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78747294-969f-4563-9d83-19f46b0045aa" path="/var/lib/kubelet/pods/78747294-969f-4563-9d83-19f46b0045aa/volumes" Oct 03 14:33:38 crc kubenswrapper[4636]: I1003 14:33:38.959419 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9qkpd"] Oct 03 14:33:38 crc kubenswrapper[4636]: I1003 14:33:38.962534 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9qkpd" Oct 03 14:33:38 crc kubenswrapper[4636]: I1003 14:33:38.975841 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9qkpd"] Oct 03 14:33:39 crc kubenswrapper[4636]: I1003 14:33:39.115765 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp5sh\" (UniqueName: \"kubernetes.io/projected/2cf500fb-1994-4cf9-9288-13f2eb0c2bb4-kube-api-access-cp5sh\") pod \"redhat-operators-9qkpd\" (UID: \"2cf500fb-1994-4cf9-9288-13f2eb0c2bb4\") " pod="openshift-marketplace/redhat-operators-9qkpd" Oct 03 14:33:39 crc kubenswrapper[4636]: I1003 14:33:39.115857 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cf500fb-1994-4cf9-9288-13f2eb0c2bb4-catalog-content\") pod \"redhat-operators-9qkpd\" (UID: \"2cf500fb-1994-4cf9-9288-13f2eb0c2bb4\") " pod="openshift-marketplace/redhat-operators-9qkpd" Oct 03 14:33:39 crc kubenswrapper[4636]: I1003 14:33:39.116033 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cf500fb-1994-4cf9-9288-13f2eb0c2bb4-utilities\") pod \"redhat-operators-9qkpd\" (UID: \"2cf500fb-1994-4cf9-9288-13f2eb0c2bb4\") " pod="openshift-marketplace/redhat-operators-9qkpd" Oct 03 14:33:39 crc kubenswrapper[4636]: I1003 14:33:39.217312 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cf500fb-1994-4cf9-9288-13f2eb0c2bb4-utilities\") pod \"redhat-operators-9qkpd\" (UID: \"2cf500fb-1994-4cf9-9288-13f2eb0c2bb4\") " pod="openshift-marketplace/redhat-operators-9qkpd" Oct 03 14:33:39 crc kubenswrapper[4636]: I1003 14:33:39.217403 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp5sh\" (UniqueName: \"kubernetes.io/projected/2cf500fb-1994-4cf9-9288-13f2eb0c2bb4-kube-api-access-cp5sh\") pod \"redhat-operators-9qkpd\" (UID: \"2cf500fb-1994-4cf9-9288-13f2eb0c2bb4\") " pod="openshift-marketplace/redhat-operators-9qkpd" Oct 03 14:33:39 crc kubenswrapper[4636]: I1003 14:33:39.217435 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cf500fb-1994-4cf9-9288-13f2eb0c2bb4-catalog-content\") pod \"redhat-operators-9qkpd\" (UID: \"2cf500fb-1994-4cf9-9288-13f2eb0c2bb4\") " 
pod="openshift-marketplace/redhat-operators-9qkpd" Oct 03 14:33:39 crc kubenswrapper[4636]: I1003 14:33:39.217801 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cf500fb-1994-4cf9-9288-13f2eb0c2bb4-utilities\") pod \"redhat-operators-9qkpd\" (UID: \"2cf500fb-1994-4cf9-9288-13f2eb0c2bb4\") " pod="openshift-marketplace/redhat-operators-9qkpd" Oct 03 14:33:39 crc kubenswrapper[4636]: I1003 14:33:39.217860 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cf500fb-1994-4cf9-9288-13f2eb0c2bb4-catalog-content\") pod \"redhat-operators-9qkpd\" (UID: \"2cf500fb-1994-4cf9-9288-13f2eb0c2bb4\") " pod="openshift-marketplace/redhat-operators-9qkpd" Oct 03 14:33:39 crc kubenswrapper[4636]: I1003 14:33:39.238960 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp5sh\" (UniqueName: \"kubernetes.io/projected/2cf500fb-1994-4cf9-9288-13f2eb0c2bb4-kube-api-access-cp5sh\") pod \"redhat-operators-9qkpd\" (UID: \"2cf500fb-1994-4cf9-9288-13f2eb0c2bb4\") " pod="openshift-marketplace/redhat-operators-9qkpd" Oct 03 14:33:39 crc kubenswrapper[4636]: I1003 14:33:39.285156 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9qkpd" Oct 03 14:33:39 crc kubenswrapper[4636]: I1003 14:33:39.754342 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9qkpd"] Oct 03 14:33:39 crc kubenswrapper[4636]: I1003 14:33:39.990507 4636 generic.go:334] "Generic (PLEG): container finished" podID="2cf500fb-1994-4cf9-9288-13f2eb0c2bb4" containerID="832343ca546e8f26f3e696e93c4d92644a3dd497a084c5b7d66ebc8fe7c27eba" exitCode=0 Oct 03 14:33:39 crc kubenswrapper[4636]: I1003 14:33:39.990802 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qkpd" event={"ID":"2cf500fb-1994-4cf9-9288-13f2eb0c2bb4","Type":"ContainerDied","Data":"832343ca546e8f26f3e696e93c4d92644a3dd497a084c5b7d66ebc8fe7c27eba"} Oct 03 14:33:39 crc kubenswrapper[4636]: I1003 14:33:39.990828 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qkpd" event={"ID":"2cf500fb-1994-4cf9-9288-13f2eb0c2bb4","Type":"ContainerStarted","Data":"f3727504eb3e902d6037f0083460fd41539282788668e09f84edff61c5315584"} Oct 03 14:33:42 crc kubenswrapper[4636]: I1003 14:33:42.015342 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qkpd" event={"ID":"2cf500fb-1994-4cf9-9288-13f2eb0c2bb4","Type":"ContainerStarted","Data":"1d0578f2a1966dc5c06d379ec230923da915b653e784e25f98f0112b2dd719ab"} Oct 03 14:33:48 crc kubenswrapper[4636]: I1003 14:33:48.060593 4636 generic.go:334] "Generic (PLEG): container finished" podID="2cf500fb-1994-4cf9-9288-13f2eb0c2bb4" containerID="1d0578f2a1966dc5c06d379ec230923da915b653e784e25f98f0112b2dd719ab" exitCode=0 Oct 03 14:33:48 crc kubenswrapper[4636]: I1003 14:33:48.060676 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qkpd" event={"ID":"2cf500fb-1994-4cf9-9288-13f2eb0c2bb4","Type":"ContainerDied","Data":"1d0578f2a1966dc5c06d379ec230923da915b653e784e25f98f0112b2dd719ab"} Oct 03 14:33:49 crc kubenswrapper[4636]: I1003 14:33:49.071068 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qkpd" 
event={"ID":"2cf500fb-1994-4cf9-9288-13f2eb0c2bb4","Type":"ContainerStarted","Data":"f0504a1c1211bc18bf4aacddeb0aca9a4a5a81b44c879f2190cefb90750c5cd6"} Oct 03 14:33:49 crc kubenswrapper[4636]: I1003 14:33:49.093716 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9qkpd" podStartSLOduration=2.583876283 podStartE2EDuration="11.093698281s" podCreationTimestamp="2025-10-03 14:33:38 +0000 UTC" firstStartedPulling="2025-10-03 14:33:39.992715083 +0000 UTC m=+1969.851441330" lastFinishedPulling="2025-10-03 14:33:48.502537081 +0000 UTC m=+1978.361263328" observedRunningTime="2025-10-03 14:33:49.086817974 +0000 UTC m=+1978.945544231" watchObservedRunningTime="2025-10-03 14:33:49.093698281 +0000 UTC m=+1978.952424528" Oct 03 14:33:49 crc kubenswrapper[4636]: I1003 14:33:49.285845 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9qkpd" Oct 03 14:33:49 crc kubenswrapper[4636]: I1003 14:33:49.286194 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9qkpd" Oct 03 14:33:50 crc kubenswrapper[4636]: I1003 14:33:50.334223 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9qkpd" podUID="2cf500fb-1994-4cf9-9288-13f2eb0c2bb4" containerName="registry-server" probeResult="failure" output=< Oct 03 14:33:50 crc kubenswrapper[4636]: timeout: failed to connect service ":50051" within 1s Oct 03 14:33:50 crc kubenswrapper[4636]: > Oct 03 14:34:00 crc kubenswrapper[4636]: I1003 14:34:00.330746 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9qkpd" podUID="2cf500fb-1994-4cf9-9288-13f2eb0c2bb4" containerName="registry-server" probeResult="failure" output=< Oct 03 14:34:00 crc kubenswrapper[4636]: timeout: failed to connect service ":50051" within 1s Oct 03 14:34:00 crc kubenswrapper[4636]: > Oct 03 14:34:04 crc kubenswrapper[4636]: I1003 14:34:04.189183 4636 generic.go:334] "Generic (PLEG): container finished" podID="9781ac24-d39e-4e00-b2e8-3eac5f120090" containerID="9a22058ec64cbacc691f35ae1d79c76304a88f882385027690e82bfcfbb7da97" exitCode=0 Oct 03 14:34:04 crc kubenswrapper[4636]: I1003 14:34:04.189230 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qplrd" event={"ID":"9781ac24-d39e-4e00-b2e8-3eac5f120090","Type":"ContainerDied","Data":"9a22058ec64cbacc691f35ae1d79c76304a88f882385027690e82bfcfbb7da97"} Oct 03 14:34:05 crc kubenswrapper[4636]: I1003 14:34:05.578778 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qplrd" Oct 03 14:34:05 crc kubenswrapper[4636]: I1003 14:34:05.761670 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pllsj\" (UniqueName: \"kubernetes.io/projected/9781ac24-d39e-4e00-b2e8-3eac5f120090-kube-api-access-pllsj\") pod \"9781ac24-d39e-4e00-b2e8-3eac5f120090\" (UID: \"9781ac24-d39e-4e00-b2e8-3eac5f120090\") " Oct 03 14:34:05 crc kubenswrapper[4636]: I1003 14:34:05.761808 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9781ac24-d39e-4e00-b2e8-3eac5f120090-inventory\") pod \"9781ac24-d39e-4e00-b2e8-3eac5f120090\" (UID: \"9781ac24-d39e-4e00-b2e8-3eac5f120090\") " Oct 03 14:34:05 crc kubenswrapper[4636]: I1003 14:34:05.761978 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9781ac24-d39e-4e00-b2e8-3eac5f120090-ssh-key\") pod \"9781ac24-d39e-4e00-b2e8-3eac5f120090\" (UID: \"9781ac24-d39e-4e00-b2e8-3eac5f120090\") " Oct 03 14:34:05 crc kubenswrapper[4636]: I1003 14:34:05.771322 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9781ac24-d39e-4e00-b2e8-3eac5f120090-kube-api-access-pllsj" (OuterVolumeSpecName: "kube-api-access-pllsj") pod "9781ac24-d39e-4e00-b2e8-3eac5f120090" (UID: "9781ac24-d39e-4e00-b2e8-3eac5f120090"). InnerVolumeSpecName "kube-api-access-pllsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:34:05 crc kubenswrapper[4636]: I1003 14:34:05.790069 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9781ac24-d39e-4e00-b2e8-3eac5f120090-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9781ac24-d39e-4e00-b2e8-3eac5f120090" (UID: "9781ac24-d39e-4e00-b2e8-3eac5f120090"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:34:05 crc kubenswrapper[4636]: I1003 14:34:05.797658 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9781ac24-d39e-4e00-b2e8-3eac5f120090-inventory" (OuterVolumeSpecName: "inventory") pod "9781ac24-d39e-4e00-b2e8-3eac5f120090" (UID: "9781ac24-d39e-4e00-b2e8-3eac5f120090"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:34:05 crc kubenswrapper[4636]: I1003 14:34:05.863687 4636 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9781ac24-d39e-4e00-b2e8-3eac5f120090-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:05 crc kubenswrapper[4636]: I1003 14:34:05.863718 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pllsj\" (UniqueName: \"kubernetes.io/projected/9781ac24-d39e-4e00-b2e8-3eac5f120090-kube-api-access-pllsj\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:05 crc kubenswrapper[4636]: I1003 14:34:05.863728 4636 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9781ac24-d39e-4e00-b2e8-3eac5f120090-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:06 crc kubenswrapper[4636]: I1003 14:34:06.207143 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qplrd" event={"ID":"9781ac24-d39e-4e00-b2e8-3eac5f120090","Type":"ContainerDied","Data":"d439cfa9f3147d25d9ed356868846601f2741681aea873db1ff2b0b035da1a37"} Oct 03 14:34:06 crc kubenswrapper[4636]: I1003 14:34:06.207365 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d439cfa9f3147d25d9ed356868846601f2741681aea873db1ff2b0b035da1a37" Oct 03 14:34:06 crc kubenswrapper[4636]: I1003 14:34:06.207171 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qplrd" Oct 03 14:34:06 crc kubenswrapper[4636]: I1003 14:34:06.282850 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2jt64"] Oct 03 14:34:06 crc kubenswrapper[4636]: E1003 14:34:06.284982 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9781ac24-d39e-4e00-b2e8-3eac5f120090" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 03 14:34:06 crc kubenswrapper[4636]: I1003 14:34:06.285013 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="9781ac24-d39e-4e00-b2e8-3eac5f120090" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 03 14:34:06 crc kubenswrapper[4636]: I1003 14:34:06.285347 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="9781ac24-d39e-4e00-b2e8-3eac5f120090" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 03 14:34:06 crc kubenswrapper[4636]: I1003 14:34:06.286083 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2jt64" Oct 03 14:34:06 crc kubenswrapper[4636]: I1003 14:34:06.288814 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8b7c8" Oct 03 14:34:06 crc kubenswrapper[4636]: I1003 14:34:06.291379 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 14:34:06 crc kubenswrapper[4636]: I1003 14:34:06.291609 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 14:34:06 crc kubenswrapper[4636]: I1003 14:34:06.291764 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:34:06 crc kubenswrapper[4636]: I1003 14:34:06.302289 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2jt64"] Oct 03 14:34:06 crc kubenswrapper[4636]: I1003 14:34:06.372612 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cm8p\" (UniqueName: \"kubernetes.io/projected/baf6dabc-cac4-4e7c-9101-dcd5cfe39647-kube-api-access-4cm8p\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2jt64\" (UID: \"baf6dabc-cac4-4e7c-9101-dcd5cfe39647\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2jt64" Oct 03 14:34:06 crc kubenswrapper[4636]: I1003 14:34:06.372763 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/baf6dabc-cac4-4e7c-9101-dcd5cfe39647-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2jt64\" (UID: \"baf6dabc-cac4-4e7c-9101-dcd5cfe39647\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2jt64" Oct 03 14:34:06 crc kubenswrapper[4636]: I1003 14:34:06.372883 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baf6dabc-cac4-4e7c-9101-dcd5cfe39647-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2jt64\" (UID: \"baf6dabc-cac4-4e7c-9101-dcd5cfe39647\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2jt64" Oct 03 14:34:06 crc kubenswrapper[4636]: I1003 14:34:06.475289 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/baf6dabc-cac4-4e7c-9101-dcd5cfe39647-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2jt64\" (UID: \"baf6dabc-cac4-4e7c-9101-dcd5cfe39647\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2jt64" Oct 03 14:34:06 crc kubenswrapper[4636]: I1003 14:34:06.475490 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baf6dabc-cac4-4e7c-9101-dcd5cfe39647-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2jt64\" (UID: \"baf6dabc-cac4-4e7c-9101-dcd5cfe39647\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2jt64" Oct 03 14:34:06 crc kubenswrapper[4636]: I1003 14:34:06.475566 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cm8p\" (UniqueName: \"kubernetes.io/projected/baf6dabc-cac4-4e7c-9101-dcd5cfe39647-kube-api-access-4cm8p\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-2jt64\" (UID: \"baf6dabc-cac4-4e7c-9101-dcd5cfe39647\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2jt64" Oct 03 14:34:06 crc kubenswrapper[4636]: I1003 14:34:06.483041 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/baf6dabc-cac4-4e7c-9101-dcd5cfe39647-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2jt64\" (UID: \"baf6dabc-cac4-4e7c-9101-dcd5cfe39647\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2jt64" Oct 03 14:34:06 crc kubenswrapper[4636]: I1003 14:34:06.484585 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baf6dabc-cac4-4e7c-9101-dcd5cfe39647-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2jt64\" (UID: \"baf6dabc-cac4-4e7c-9101-dcd5cfe39647\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2jt64" Oct 03 14:34:06 crc kubenswrapper[4636]: I1003 14:34:06.507726 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cm8p\" (UniqueName: \"kubernetes.io/projected/baf6dabc-cac4-4e7c-9101-dcd5cfe39647-kube-api-access-4cm8p\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2jt64\" (UID: \"baf6dabc-cac4-4e7c-9101-dcd5cfe39647\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2jt64" Oct 03 14:34:06 crc kubenswrapper[4636]: I1003 14:34:06.620642 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2jt64" Oct 03 14:34:07 crc kubenswrapper[4636]: I1003 14:34:07.156534 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2jt64"] Oct 03 14:34:07 crc kubenswrapper[4636]: I1003 14:34:07.214995 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2jt64" event={"ID":"baf6dabc-cac4-4e7c-9101-dcd5cfe39647","Type":"ContainerStarted","Data":"fde8071f09f530c3e34cf2a4d3aafd2799572199eddc2684ba0f31cc1e25310c"} Oct 03 14:34:08 crc kubenswrapper[4636]: I1003 14:34:08.223536 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2jt64" event={"ID":"baf6dabc-cac4-4e7c-9101-dcd5cfe39647","Type":"ContainerStarted","Data":"63773c8cc2bb3065ddc2241cf4b2e9dba94f3970ab05f14fa1d3762239c93224"} Oct 03 14:34:08 crc kubenswrapper[4636]: I1003 14:34:08.241878 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2jt64" podStartSLOduration=1.803887314 podStartE2EDuration="2.241854874s" podCreationTimestamp="2025-10-03 14:34:06 +0000 UTC" firstStartedPulling="2025-10-03 14:34:07.163335774 +0000 UTC m=+1997.022062021" lastFinishedPulling="2025-10-03 14:34:07.601303334 +0000 UTC m=+1997.460029581" observedRunningTime="2025-10-03 14:34:08.236857536 +0000 UTC m=+1998.095583783" watchObservedRunningTime="2025-10-03 14:34:08.241854874 +0000 UTC m=+1998.100581121" Oct 03 14:34:09 crc kubenswrapper[4636]: I1003 14:34:09.163222 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:34:09 crc kubenswrapper[4636]: I1003 14:34:09.163586 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:34:09 crc kubenswrapper[4636]: I1003 14:34:09.338873 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9qkpd" Oct 03 14:34:09 crc kubenswrapper[4636]: I1003 14:34:09.388430 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9qkpd" Oct 03 14:34:09 crc kubenswrapper[4636]: I1003 14:34:09.676226 4636 scope.go:117] "RemoveContainer" containerID="5a713a957138047512d15df01d3093a25f5a18668f9494292c1a3c8e43d40cea" Oct 03 14:34:10 crc kubenswrapper[4636]: I1003 14:34:10.160924 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9qkpd"] Oct 03 14:34:11 crc kubenswrapper[4636]: I1003 14:34:11.244719 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9qkpd" podUID="2cf500fb-1994-4cf9-9288-13f2eb0c2bb4" containerName="registry-server" containerID="cri-o://f0504a1c1211bc18bf4aacddeb0aca9a4a5a81b44c879f2190cefb90750c5cd6" gracePeriod=2 Oct 03 14:34:11 crc kubenswrapper[4636]: I1003 14:34:11.710545 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9qkpd" Oct 03 14:34:11 crc kubenswrapper[4636]: I1003 14:34:11.879639 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp5sh\" (UniqueName: \"kubernetes.io/projected/2cf500fb-1994-4cf9-9288-13f2eb0c2bb4-kube-api-access-cp5sh\") pod \"2cf500fb-1994-4cf9-9288-13f2eb0c2bb4\" (UID: \"2cf500fb-1994-4cf9-9288-13f2eb0c2bb4\") " Oct 03 14:34:11 crc kubenswrapper[4636]: I1003 14:34:11.880040 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cf500fb-1994-4cf9-9288-13f2eb0c2bb4-catalog-content\") pod \"2cf500fb-1994-4cf9-9288-13f2eb0c2bb4\" (UID: \"2cf500fb-1994-4cf9-9288-13f2eb0c2bb4\") " Oct 03 14:34:11 crc kubenswrapper[4636]: I1003 14:34:11.880074 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cf500fb-1994-4cf9-9288-13f2eb0c2bb4-utilities\") pod \"2cf500fb-1994-4cf9-9288-13f2eb0c2bb4\" (UID: \"2cf500fb-1994-4cf9-9288-13f2eb0c2bb4\") " Oct 03 14:34:11 crc kubenswrapper[4636]: I1003 14:34:11.881406 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cf500fb-1994-4cf9-9288-13f2eb0c2bb4-utilities" (OuterVolumeSpecName: "utilities") pod "2cf500fb-1994-4cf9-9288-13f2eb0c2bb4" (UID: "2cf500fb-1994-4cf9-9288-13f2eb0c2bb4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:34:11 crc kubenswrapper[4636]: I1003 14:34:11.885466 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cf500fb-1994-4cf9-9288-13f2eb0c2bb4-kube-api-access-cp5sh" (OuterVolumeSpecName: "kube-api-access-cp5sh") pod "2cf500fb-1994-4cf9-9288-13f2eb0c2bb4" (UID: "2cf500fb-1994-4cf9-9288-13f2eb0c2bb4"). InnerVolumeSpecName "kube-api-access-cp5sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:34:11 crc kubenswrapper[4636]: I1003 14:34:11.974018 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cf500fb-1994-4cf9-9288-13f2eb0c2bb4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2cf500fb-1994-4cf9-9288-13f2eb0c2bb4" (UID: "2cf500fb-1994-4cf9-9288-13f2eb0c2bb4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:34:11 crc kubenswrapper[4636]: I1003 14:34:11.981973 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp5sh\" (UniqueName: \"kubernetes.io/projected/2cf500fb-1994-4cf9-9288-13f2eb0c2bb4-kube-api-access-cp5sh\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:11 crc kubenswrapper[4636]: I1003 14:34:11.982003 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cf500fb-1994-4cf9-9288-13f2eb0c2bb4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:11 crc kubenswrapper[4636]: I1003 14:34:11.982013 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cf500fb-1994-4cf9-9288-13f2eb0c2bb4-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:12 crc kubenswrapper[4636]: I1003 14:34:12.254867 4636 generic.go:334] "Generic (PLEG): container finished" podID="2cf500fb-1994-4cf9-9288-13f2eb0c2bb4" containerID="f0504a1c1211bc18bf4aacddeb0aca9a4a5a81b44c879f2190cefb90750c5cd6" exitCode=0 Oct 03 14:34:12 crc kubenswrapper[4636]: I1003 14:34:12.254918 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qkpd" event={"ID":"2cf500fb-1994-4cf9-9288-13f2eb0c2bb4","Type":"ContainerDied","Data":"f0504a1c1211bc18bf4aacddeb0aca9a4a5a81b44c879f2190cefb90750c5cd6"} Oct 03 14:34:12 crc kubenswrapper[4636]: I1003 14:34:12.254973 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qkpd" event={"ID":"2cf500fb-1994-4cf9-9288-13f2eb0c2bb4","Type":"ContainerDied","Data":"f3727504eb3e902d6037f0083460fd41539282788668e09f84edff61c5315584"} Oct 03 14:34:12 crc kubenswrapper[4636]: I1003 14:34:12.254994 4636 scope.go:117] "RemoveContainer" containerID="f0504a1c1211bc18bf4aacddeb0aca9a4a5a81b44c879f2190cefb90750c5cd6" Oct 03 14:34:12 crc kubenswrapper[4636]: I1003 14:34:12.255997 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9qkpd" Oct 03 14:34:12 crc kubenswrapper[4636]: I1003 14:34:12.296038 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9qkpd"] Oct 03 14:34:12 crc kubenswrapper[4636]: I1003 14:34:12.297560 4636 scope.go:117] "RemoveContainer" containerID="1d0578f2a1966dc5c06d379ec230923da915b653e784e25f98f0112b2dd719ab" Oct 03 14:34:12 crc kubenswrapper[4636]: I1003 14:34:12.306372 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9qkpd"] Oct 03 14:34:12 crc kubenswrapper[4636]: I1003 14:34:12.324817 4636 scope.go:117] "RemoveContainer" containerID="832343ca546e8f26f3e696e93c4d92644a3dd497a084c5b7d66ebc8fe7c27eba" Oct 03 14:34:12 crc kubenswrapper[4636]: I1003 14:34:12.368506 4636 scope.go:117] "RemoveContainer" containerID="f0504a1c1211bc18bf4aacddeb0aca9a4a5a81b44c879f2190cefb90750c5cd6" Oct 03 14:34:12 crc kubenswrapper[4636]: E1003 14:34:12.368989 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0504a1c1211bc18bf4aacddeb0aca9a4a5a81b44c879f2190cefb90750c5cd6\": container with ID starting with f0504a1c1211bc18bf4aacddeb0aca9a4a5a81b44c879f2190cefb90750c5cd6 not found: ID does not exist" containerID="f0504a1c1211bc18bf4aacddeb0aca9a4a5a81b44c879f2190cefb90750c5cd6" Oct 03 14:34:12 crc kubenswrapper[4636]: I1003 14:34:12.369042 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0504a1c1211bc18bf4aacddeb0aca9a4a5a81b44c879f2190cefb90750c5cd6"} err="failed to get container status \"f0504a1c1211bc18bf4aacddeb0aca9a4a5a81b44c879f2190cefb90750c5cd6\": rpc error: code = NotFound desc = could not find container \"f0504a1c1211bc18bf4aacddeb0aca9a4a5a81b44c879f2190cefb90750c5cd6\": container with ID starting with f0504a1c1211bc18bf4aacddeb0aca9a4a5a81b44c879f2190cefb90750c5cd6 not found: ID does not exist" Oct 03 14:34:12 crc kubenswrapper[4636]: I1003 14:34:12.369072 4636 scope.go:117] "RemoveContainer" containerID="1d0578f2a1966dc5c06d379ec230923da915b653e784e25f98f0112b2dd719ab" Oct 03 14:34:12 crc kubenswrapper[4636]: E1003 14:34:12.369476 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d0578f2a1966dc5c06d379ec230923da915b653e784e25f98f0112b2dd719ab\": container with ID starting with 1d0578f2a1966dc5c06d379ec230923da915b653e784e25f98f0112b2dd719ab not found: ID does not exist" containerID="1d0578f2a1966dc5c06d379ec230923da915b653e784e25f98f0112b2dd719ab" Oct 03 14:34:12 crc kubenswrapper[4636]: I1003 14:34:12.369513 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d0578f2a1966dc5c06d379ec230923da915b653e784e25f98f0112b2dd719ab"} err="failed to get container status \"1d0578f2a1966dc5c06d379ec230923da915b653e784e25f98f0112b2dd719ab\": rpc error: code = NotFound desc = could not find container \"1d0578f2a1966dc5c06d379ec230923da915b653e784e25f98f0112b2dd719ab\": container with ID starting with 1d0578f2a1966dc5c06d379ec230923da915b653e784e25f98f0112b2dd719ab not found: ID does not exist" Oct 03 14:34:12 crc kubenswrapper[4636]: I1003 14:34:12.369540 4636 scope.go:117] "RemoveContainer" containerID="832343ca546e8f26f3e696e93c4d92644a3dd497a084c5b7d66ebc8fe7c27eba" Oct 03 14:34:12 crc kubenswrapper[4636]: E1003 14:34:12.369807 4636 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"832343ca546e8f26f3e696e93c4d92644a3dd497a084c5b7d66ebc8fe7c27eba\": container with ID starting with 832343ca546e8f26f3e696e93c4d92644a3dd497a084c5b7d66ebc8fe7c27eba not found: ID does not exist" containerID="832343ca546e8f26f3e696e93c4d92644a3dd497a084c5b7d66ebc8fe7c27eba" Oct 03 14:34:12 crc kubenswrapper[4636]: I1003 14:34:12.369856 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"832343ca546e8f26f3e696e93c4d92644a3dd497a084c5b7d66ebc8fe7c27eba"} err="failed to get container status \"832343ca546e8f26f3e696e93c4d92644a3dd497a084c5b7d66ebc8fe7c27eba\": rpc error: code = NotFound desc = could not find container \"832343ca546e8f26f3e696e93c4d92644a3dd497a084c5b7d66ebc8fe7c27eba\": container with ID starting with 832343ca546e8f26f3e696e93c4d92644a3dd497a084c5b7d66ebc8fe7c27eba not found: ID does not exist" Oct 03 14:34:12 crc kubenswrapper[4636]: I1003 14:34:12.808931 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cf500fb-1994-4cf9-9288-13f2eb0c2bb4" path="/var/lib/kubelet/pods/2cf500fb-1994-4cf9-9288-13f2eb0c2bb4/volumes" Oct 03 14:34:13 crc kubenswrapper[4636]: I1003 14:34:13.267223 4636 generic.go:334] "Generic (PLEG): container finished" podID="baf6dabc-cac4-4e7c-9101-dcd5cfe39647" containerID="63773c8cc2bb3065ddc2241cf4b2e9dba94f3970ab05f14fa1d3762239c93224" exitCode=0 Oct 03 14:34:13 crc kubenswrapper[4636]: I1003 14:34:13.267266 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2jt64" event={"ID":"baf6dabc-cac4-4e7c-9101-dcd5cfe39647","Type":"ContainerDied","Data":"63773c8cc2bb3065ddc2241cf4b2e9dba94f3970ab05f14fa1d3762239c93224"} Oct 03 14:34:14 crc kubenswrapper[4636]: I1003 14:34:14.636071 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2jt64" Oct 03 14:34:14 crc kubenswrapper[4636]: I1003 14:34:14.829217 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/baf6dabc-cac4-4e7c-9101-dcd5cfe39647-ssh-key\") pod \"baf6dabc-cac4-4e7c-9101-dcd5cfe39647\" (UID: \"baf6dabc-cac4-4e7c-9101-dcd5cfe39647\") " Oct 03 14:34:14 crc kubenswrapper[4636]: I1003 14:34:14.829284 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cm8p\" (UniqueName: \"kubernetes.io/projected/baf6dabc-cac4-4e7c-9101-dcd5cfe39647-kube-api-access-4cm8p\") pod \"baf6dabc-cac4-4e7c-9101-dcd5cfe39647\" (UID: \"baf6dabc-cac4-4e7c-9101-dcd5cfe39647\") " Oct 03 14:34:14 crc kubenswrapper[4636]: I1003 14:34:14.829350 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baf6dabc-cac4-4e7c-9101-dcd5cfe39647-inventory\") pod \"baf6dabc-cac4-4e7c-9101-dcd5cfe39647\" (UID: \"baf6dabc-cac4-4e7c-9101-dcd5cfe39647\") " Oct 03 14:34:14 crc kubenswrapper[4636]: I1003 14:34:14.834622 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baf6dabc-cac4-4e7c-9101-dcd5cfe39647-kube-api-access-4cm8p" (OuterVolumeSpecName: "kube-api-access-4cm8p") pod "baf6dabc-cac4-4e7c-9101-dcd5cfe39647" (UID: "baf6dabc-cac4-4e7c-9101-dcd5cfe39647"). InnerVolumeSpecName "kube-api-access-4cm8p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:34:14 crc kubenswrapper[4636]: I1003 14:34:14.854669 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baf6dabc-cac4-4e7c-9101-dcd5cfe39647-inventory" (OuterVolumeSpecName: "inventory") pod "baf6dabc-cac4-4e7c-9101-dcd5cfe39647" (UID: "baf6dabc-cac4-4e7c-9101-dcd5cfe39647"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:34:14 crc kubenswrapper[4636]: I1003 14:34:14.855776 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baf6dabc-cac4-4e7c-9101-dcd5cfe39647-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "baf6dabc-cac4-4e7c-9101-dcd5cfe39647" (UID: "baf6dabc-cac4-4e7c-9101-dcd5cfe39647"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:34:14 crc kubenswrapper[4636]: I1003 14:34:14.934507 4636 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/baf6dabc-cac4-4e7c-9101-dcd5cfe39647-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:14 crc kubenswrapper[4636]: I1003 14:34:14.934535 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cm8p\" (UniqueName: \"kubernetes.io/projected/baf6dabc-cac4-4e7c-9101-dcd5cfe39647-kube-api-access-4cm8p\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:14 crc kubenswrapper[4636]: I1003 14:34:14.934546 4636 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baf6dabc-cac4-4e7c-9101-dcd5cfe39647-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.281890 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2jt64" event={"ID":"baf6dabc-cac4-4e7c-9101-dcd5cfe39647","Type":"ContainerDied","Data":"fde8071f09f530c3e34cf2a4d3aafd2799572199eddc2684ba0f31cc1e25310c"} Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.282208 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fde8071f09f530c3e34cf2a4d3aafd2799572199eddc2684ba0f31cc1e25310c" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.282256 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2jt64" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.423492 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-r9sdw"] Oct 03 14:34:15 crc kubenswrapper[4636]: E1003 14:34:15.424169 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baf6dabc-cac4-4e7c-9101-dcd5cfe39647" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.424248 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="baf6dabc-cac4-4e7c-9101-dcd5cfe39647" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 03 14:34:15 crc kubenswrapper[4636]: E1003 14:34:15.424364 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cf500fb-1994-4cf9-9288-13f2eb0c2bb4" containerName="extract-utilities" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.424434 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cf500fb-1994-4cf9-9288-13f2eb0c2bb4" containerName="extract-utilities" Oct 03 14:34:15 crc kubenswrapper[4636]: E1003 14:34:15.424589 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cf500fb-1994-4cf9-9288-13f2eb0c2bb4" containerName="extract-content" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.424654 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cf500fb-1994-4cf9-9288-13f2eb0c2bb4" containerName="extract-content" Oct 03 14:34:15 crc kubenswrapper[4636]: E1003 14:34:15.424714 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cf500fb-1994-4cf9-9288-13f2eb0c2bb4" containerName="registry-server" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.424784 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cf500fb-1994-4cf9-9288-13f2eb0c2bb4" containerName="registry-server" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.425060 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="baf6dabc-cac4-4e7c-9101-dcd5cfe39647" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.425172 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cf500fb-1994-4cf9-9288-13f2eb0c2bb4" containerName="registry-server" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.425836 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r9sdw" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.430715 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8b7c8" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.430952 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.431062 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.431189 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.436080 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-r9sdw"] Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.443881 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mlj7\" (UniqueName: \"kubernetes.io/projected/52b89ba7-3476-42ae-aa47-fb7a38732669-kube-api-access-9mlj7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r9sdw\" (UID: \"52b89ba7-3476-42ae-aa47-fb7a38732669\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r9sdw" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.443955 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52b89ba7-3476-42ae-aa47-fb7a38732669-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r9sdw\" (UID: \"52b89ba7-3476-42ae-aa47-fb7a38732669\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r9sdw" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.443990 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52b89ba7-3476-42ae-aa47-fb7a38732669-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r9sdw\" (UID: \"52b89ba7-3476-42ae-aa47-fb7a38732669\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r9sdw" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.545844 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52b89ba7-3476-42ae-aa47-fb7a38732669-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r9sdw\" (UID: \"52b89ba7-3476-42ae-aa47-fb7a38732669\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r9sdw" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.545903 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52b89ba7-3476-42ae-aa47-fb7a38732669-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r9sdw\" (UID: \"52b89ba7-3476-42ae-aa47-fb7a38732669\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r9sdw" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.546783 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mlj7\" (UniqueName: \"kubernetes.io/projected/52b89ba7-3476-42ae-aa47-fb7a38732669-kube-api-access-9mlj7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r9sdw\" (UID: 
\"52b89ba7-3476-42ae-aa47-fb7a38732669\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r9sdw" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.560302 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52b89ba7-3476-42ae-aa47-fb7a38732669-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r9sdw\" (UID: \"52b89ba7-3476-42ae-aa47-fb7a38732669\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r9sdw" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.560418 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52b89ba7-3476-42ae-aa47-fb7a38732669-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r9sdw\" (UID: \"52b89ba7-3476-42ae-aa47-fb7a38732669\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r9sdw" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.572281 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mlj7\" (UniqueName: \"kubernetes.io/projected/52b89ba7-3476-42ae-aa47-fb7a38732669-kube-api-access-9mlj7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r9sdw\" (UID: \"52b89ba7-3476-42ae-aa47-fb7a38732669\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r9sdw" Oct 03 14:34:15 crc kubenswrapper[4636]: I1003 14:34:15.743475 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r9sdw" Oct 03 14:34:16 crc kubenswrapper[4636]: I1003 14:34:16.087872 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-r9sdw"] Oct 03 14:34:16 crc kubenswrapper[4636]: I1003 14:34:16.291294 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r9sdw" event={"ID":"52b89ba7-3476-42ae-aa47-fb7a38732669","Type":"ContainerStarted","Data":"b68de1fdd8e4a266be88a98c9e1301e1886277e982cd2bcfbc94e43878dc4cf2"} Oct 03 14:34:17 crc kubenswrapper[4636]: I1003 14:34:17.326418 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r9sdw" event={"ID":"52b89ba7-3476-42ae-aa47-fb7a38732669","Type":"ContainerStarted","Data":"7f1f9a11f3054dedd504ba17d68e6a1089c8dc3f6af227206d88402df218c6ef"} Oct 03 14:34:17 crc kubenswrapper[4636]: I1003 14:34:17.369033 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r9sdw" podStartSLOduration=1.837291279 podStartE2EDuration="2.369008411s" podCreationTimestamp="2025-10-03 14:34:15 +0000 UTC" firstStartedPulling="2025-10-03 14:34:16.098882403 +0000 UTC m=+2005.957608650" lastFinishedPulling="2025-10-03 14:34:16.630599535 +0000 UTC m=+2006.489325782" observedRunningTime="2025-10-03 14:34:17.358235364 +0000 UTC m=+2007.216961611" watchObservedRunningTime="2025-10-03 14:34:17.369008411 +0000 UTC m=+2007.227734658" Oct 03 14:34:39 crc kubenswrapper[4636]: I1003 14:34:39.163357 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:34:39 crc kubenswrapper[4636]: I1003 14:34:39.163802 4636 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:34:55 crc kubenswrapper[4636]: I1003 14:34:55.575722 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x57lh"] Oct 03 14:34:55 crc kubenswrapper[4636]: I1003 14:34:55.578401 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x57lh" Oct 03 14:34:55 crc kubenswrapper[4636]: I1003 14:34:55.588159 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x57lh"] Oct 03 14:34:55 crc kubenswrapper[4636]: I1003 14:34:55.722031 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64306433-d60a-4f75-9376-41a83875c1c3-utilities\") pod \"redhat-marketplace-x57lh\" (UID: \"64306433-d60a-4f75-9376-41a83875c1c3\") " pod="openshift-marketplace/redhat-marketplace-x57lh" Oct 03 14:34:55 crc kubenswrapper[4636]: I1003 14:34:55.722365 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwhfb\" (UniqueName: \"kubernetes.io/projected/64306433-d60a-4f75-9376-41a83875c1c3-kube-api-access-bwhfb\") pod \"redhat-marketplace-x57lh\" (UID: \"64306433-d60a-4f75-9376-41a83875c1c3\") " pod="openshift-marketplace/redhat-marketplace-x57lh" Oct 03 14:34:55 crc kubenswrapper[4636]: I1003 14:34:55.722470 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64306433-d60a-4f75-9376-41a83875c1c3-catalog-content\") pod \"redhat-marketplace-x57lh\" (UID: \"64306433-d60a-4f75-9376-41a83875c1c3\") " pod="openshift-marketplace/redhat-marketplace-x57lh" Oct 03 14:34:55 crc kubenswrapper[4636]: I1003 14:34:55.825012 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwhfb\" (UniqueName: \"kubernetes.io/projected/64306433-d60a-4f75-9376-41a83875c1c3-kube-api-access-bwhfb\") pod \"redhat-marketplace-x57lh\" (UID: \"64306433-d60a-4f75-9376-41a83875c1c3\") " pod="openshift-marketplace/redhat-marketplace-x57lh" Oct 03 14:34:55 crc kubenswrapper[4636]: I1003 14:34:55.825075 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64306433-d60a-4f75-9376-41a83875c1c3-catalog-content\") pod \"redhat-marketplace-x57lh\" (UID: \"64306433-d60a-4f75-9376-41a83875c1c3\") " pod="openshift-marketplace/redhat-marketplace-x57lh" Oct 03 14:34:55 crc kubenswrapper[4636]: I1003 14:34:55.825204 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64306433-d60a-4f75-9376-41a83875c1c3-utilities\") pod \"redhat-marketplace-x57lh\" (UID: \"64306433-d60a-4f75-9376-41a83875c1c3\") " pod="openshift-marketplace/redhat-marketplace-x57lh" Oct 03 14:34:55 crc kubenswrapper[4636]: I1003 14:34:55.825727 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64306433-d60a-4f75-9376-41a83875c1c3-catalog-content\") pod \"redhat-marketplace-x57lh\" (UID: 
\"64306433-d60a-4f75-9376-41a83875c1c3\") " pod="openshift-marketplace/redhat-marketplace-x57lh" Oct 03 14:34:55 crc kubenswrapper[4636]: I1003 14:34:55.826021 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64306433-d60a-4f75-9376-41a83875c1c3-utilities\") pod \"redhat-marketplace-x57lh\" (UID: \"64306433-d60a-4f75-9376-41a83875c1c3\") " pod="openshift-marketplace/redhat-marketplace-x57lh" Oct 03 14:34:55 crc kubenswrapper[4636]: I1003 14:34:55.846677 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwhfb\" (UniqueName: \"kubernetes.io/projected/64306433-d60a-4f75-9376-41a83875c1c3-kube-api-access-bwhfb\") pod \"redhat-marketplace-x57lh\" (UID: \"64306433-d60a-4f75-9376-41a83875c1c3\") " pod="openshift-marketplace/redhat-marketplace-x57lh" Oct 03 14:34:55 crc kubenswrapper[4636]: I1003 14:34:55.897009 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x57lh" Oct 03 14:34:56 crc kubenswrapper[4636]: I1003 14:34:56.388312 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x57lh"] Oct 03 14:34:56 crc kubenswrapper[4636]: W1003 14:34:56.398302 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64306433_d60a_4f75_9376_41a83875c1c3.slice/crio-7200321761bdf065df6117136484624fa4f619ec473afd13019a8b0097b58e19 WatchSource:0}: Error finding container 7200321761bdf065df6117136484624fa4f619ec473afd13019a8b0097b58e19: Status 404 returned error can't find the container with id 7200321761bdf065df6117136484624fa4f619ec473afd13019a8b0097b58e19 Oct 03 14:34:56 crc kubenswrapper[4636]: I1003 14:34:56.671275 4636 generic.go:334] "Generic (PLEG): container finished" podID="64306433-d60a-4f75-9376-41a83875c1c3" containerID="10cd76f6f2dfb186a58561344b209a034e6bf8c1203fc9e09456d66f22e550aa" exitCode=0 Oct 03 14:34:56 crc kubenswrapper[4636]: I1003 14:34:56.671318 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x57lh" event={"ID":"64306433-d60a-4f75-9376-41a83875c1c3","Type":"ContainerDied","Data":"10cd76f6f2dfb186a58561344b209a034e6bf8c1203fc9e09456d66f22e550aa"} Oct 03 14:34:56 crc kubenswrapper[4636]: I1003 14:34:56.671588 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x57lh" event={"ID":"64306433-d60a-4f75-9376-41a83875c1c3","Type":"ContainerStarted","Data":"7200321761bdf065df6117136484624fa4f619ec473afd13019a8b0097b58e19"} Oct 03 14:34:56 crc kubenswrapper[4636]: I1003 14:34:56.672950 4636 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 14:34:57 crc kubenswrapper[4636]: I1003 14:34:57.680460 4636 generic.go:334] "Generic (PLEG): container finished" podID="52b89ba7-3476-42ae-aa47-fb7a38732669" containerID="7f1f9a11f3054dedd504ba17d68e6a1089c8dc3f6af227206d88402df218c6ef" exitCode=0 Oct 03 14:34:57 crc kubenswrapper[4636]: I1003 14:34:57.680551 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r9sdw" event={"ID":"52b89ba7-3476-42ae-aa47-fb7a38732669","Type":"ContainerDied","Data":"7f1f9a11f3054dedd504ba17d68e6a1089c8dc3f6af227206d88402df218c6ef"} Oct 03 14:34:57 crc kubenswrapper[4636]: I1003 14:34:57.683623 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-x57lh" event={"ID":"64306433-d60a-4f75-9376-41a83875c1c3","Type":"ContainerStarted","Data":"09c62bd1ad3ce16d2474cfba4b40224b1bba481d734d9dcf978203065b74985f"} Oct 03 14:34:58 crc kubenswrapper[4636]: I1003 14:34:58.693564 4636 generic.go:334] "Generic (PLEG): container finished" podID="64306433-d60a-4f75-9376-41a83875c1c3" containerID="09c62bd1ad3ce16d2474cfba4b40224b1bba481d734d9dcf978203065b74985f" exitCode=0 Oct 03 14:34:58 crc kubenswrapper[4636]: I1003 14:34:58.693638 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x57lh" event={"ID":"64306433-d60a-4f75-9376-41a83875c1c3","Type":"ContainerDied","Data":"09c62bd1ad3ce16d2474cfba4b40224b1bba481d734d9dcf978203065b74985f"} Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.101787 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r9sdw" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.285751 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52b89ba7-3476-42ae-aa47-fb7a38732669-ssh-key\") pod \"52b89ba7-3476-42ae-aa47-fb7a38732669\" (UID: \"52b89ba7-3476-42ae-aa47-fb7a38732669\") " Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.285891 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mlj7\" (UniqueName: \"kubernetes.io/projected/52b89ba7-3476-42ae-aa47-fb7a38732669-kube-api-access-9mlj7\") pod \"52b89ba7-3476-42ae-aa47-fb7a38732669\" (UID: \"52b89ba7-3476-42ae-aa47-fb7a38732669\") " Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.285966 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52b89ba7-3476-42ae-aa47-fb7a38732669-inventory\") pod \"52b89ba7-3476-42ae-aa47-fb7a38732669\" (UID: \"52b89ba7-3476-42ae-aa47-fb7a38732669\") " Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.304241 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52b89ba7-3476-42ae-aa47-fb7a38732669-kube-api-access-9mlj7" (OuterVolumeSpecName: "kube-api-access-9mlj7") pod "52b89ba7-3476-42ae-aa47-fb7a38732669" (UID: "52b89ba7-3476-42ae-aa47-fb7a38732669"). InnerVolumeSpecName "kube-api-access-9mlj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.311931 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52b89ba7-3476-42ae-aa47-fb7a38732669-inventory" (OuterVolumeSpecName: "inventory") pod "52b89ba7-3476-42ae-aa47-fb7a38732669" (UID: "52b89ba7-3476-42ae-aa47-fb7a38732669"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.318388 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52b89ba7-3476-42ae-aa47-fb7a38732669-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "52b89ba7-3476-42ae-aa47-fb7a38732669" (UID: "52b89ba7-3476-42ae-aa47-fb7a38732669"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.388740 4636 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52b89ba7-3476-42ae-aa47-fb7a38732669-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.389010 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mlj7\" (UniqueName: \"kubernetes.io/projected/52b89ba7-3476-42ae-aa47-fb7a38732669-kube-api-access-9mlj7\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.389022 4636 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52b89ba7-3476-42ae-aa47-fb7a38732669-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.560060 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nx8cj"] Oct 03 14:34:59 crc kubenswrapper[4636]: E1003 14:34:59.560515 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b89ba7-3476-42ae-aa47-fb7a38732669" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.560543 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b89ba7-3476-42ae-aa47-fb7a38732669" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.560713 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="52b89ba7-3476-42ae-aa47-fb7a38732669" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.562369 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nx8cj" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.573182 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nx8cj"] Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.701492 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83939a20-33aa-47fd-b4db-1c9e917cd74f-utilities\") pod \"certified-operators-nx8cj\" (UID: \"83939a20-33aa-47fd-b4db-1c9e917cd74f\") " pod="openshift-marketplace/certified-operators-nx8cj" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.701650 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83939a20-33aa-47fd-b4db-1c9e917cd74f-catalog-content\") pod \"certified-operators-nx8cj\" (UID: \"83939a20-33aa-47fd-b4db-1c9e917cd74f\") " pod="openshift-marketplace/certified-operators-nx8cj" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.701734 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlwpt\" (UniqueName: \"kubernetes.io/projected/83939a20-33aa-47fd-b4db-1c9e917cd74f-kube-api-access-mlwpt\") pod \"certified-operators-nx8cj\" (UID: \"83939a20-33aa-47fd-b4db-1c9e917cd74f\") " pod="openshift-marketplace/certified-operators-nx8cj" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.705373 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x57lh" event={"ID":"64306433-d60a-4f75-9376-41a83875c1c3","Type":"ContainerStarted","Data":"bbea1c1e07553c963e47f95781616d2d34860639466b23f8ab1639d11398c054"} Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.708257 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r9sdw" event={"ID":"52b89ba7-3476-42ae-aa47-fb7a38732669","Type":"ContainerDied","Data":"b68de1fdd8e4a266be88a98c9e1301e1886277e982cd2bcfbc94e43878dc4cf2"} Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.708314 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b68de1fdd8e4a266be88a98c9e1301e1886277e982cd2bcfbc94e43878dc4cf2" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.708416 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r9sdw" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.735427 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x57lh" podStartSLOduration=2.126908391 podStartE2EDuration="4.735402987s" podCreationTimestamp="2025-10-03 14:34:55 +0000 UTC" firstStartedPulling="2025-10-03 14:34:56.672718237 +0000 UTC m=+2046.531444484" lastFinishedPulling="2025-10-03 14:34:59.281212833 +0000 UTC m=+2049.139939080" observedRunningTime="2025-10-03 14:34:59.726337103 +0000 UTC m=+2049.585063340" watchObservedRunningTime="2025-10-03 14:34:59.735402987 +0000 UTC m=+2049.594129244" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.803560 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83939a20-33aa-47fd-b4db-1c9e917cd74f-utilities\") pod \"certified-operators-nx8cj\" (UID: \"83939a20-33aa-47fd-b4db-1c9e917cd74f\") " pod="openshift-marketplace/certified-operators-nx8cj" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.803820 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83939a20-33aa-47fd-b4db-1c9e917cd74f-catalog-content\") pod \"certified-operators-nx8cj\" (UID: \"83939a20-33aa-47fd-b4db-1c9e917cd74f\") " pod="openshift-marketplace/certified-operators-nx8cj" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.803996 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlwpt\" (UniqueName: \"kubernetes.io/projected/83939a20-33aa-47fd-b4db-1c9e917cd74f-kube-api-access-mlwpt\") pod \"certified-operators-nx8cj\" (UID: \"83939a20-33aa-47fd-b4db-1c9e917cd74f\") " pod="openshift-marketplace/certified-operators-nx8cj" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.804407 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83939a20-33aa-47fd-b4db-1c9e917cd74f-utilities\") pod \"certified-operators-nx8cj\" (UID: \"83939a20-33aa-47fd-b4db-1c9e917cd74f\") " pod="openshift-marketplace/certified-operators-nx8cj" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.804804 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83939a20-33aa-47fd-b4db-1c9e917cd74f-catalog-content\") pod \"certified-operators-nx8cj\" (UID: \"83939a20-33aa-47fd-b4db-1c9e917cd74f\") " pod="openshift-marketplace/certified-operators-nx8cj" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.825627 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlwpt\" (UniqueName: \"kubernetes.io/projected/83939a20-33aa-47fd-b4db-1c9e917cd74f-kube-api-access-mlwpt\") pod \"certified-operators-nx8cj\" (UID: \"83939a20-33aa-47fd-b4db-1c9e917cd74f\") " pod="openshift-marketplace/certified-operators-nx8cj" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.857203 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kn84r"] Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.858401 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kn84r" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.865686 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8b7c8" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.865764 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.878670 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.878832 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kn84r"] Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.878675 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.915510 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nx8cj" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.929019 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg5mn\" (UniqueName: \"kubernetes.io/projected/ca12d2cd-3187-4910-9e28-2f977be4bcf8-kube-api-access-pg5mn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kn84r\" (UID: \"ca12d2cd-3187-4910-9e28-2f977be4bcf8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kn84r" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.929153 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca12d2cd-3187-4910-9e28-2f977be4bcf8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kn84r\" (UID: \"ca12d2cd-3187-4910-9e28-2f977be4bcf8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kn84r" Oct 03 14:34:59 crc kubenswrapper[4636]: I1003 14:34:59.929272 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ca12d2cd-3187-4910-9e28-2f977be4bcf8-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kn84r\" (UID: \"ca12d2cd-3187-4910-9e28-2f977be4bcf8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kn84r" Oct 03 14:35:00 crc kubenswrapper[4636]: I1003 14:35:00.030558 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ca12d2cd-3187-4910-9e28-2f977be4bcf8-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kn84r\" (UID: \"ca12d2cd-3187-4910-9e28-2f977be4bcf8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kn84r" Oct 03 14:35:00 crc kubenswrapper[4636]: I1003 14:35:00.030963 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg5mn\" (UniqueName: \"kubernetes.io/projected/ca12d2cd-3187-4910-9e28-2f977be4bcf8-kube-api-access-pg5mn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kn84r\" (UID: \"ca12d2cd-3187-4910-9e28-2f977be4bcf8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kn84r" Oct 03 14:35:00 crc kubenswrapper[4636]: I1003 14:35:00.031007 4636 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca12d2cd-3187-4910-9e28-2f977be4bcf8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kn84r\" (UID: \"ca12d2cd-3187-4910-9e28-2f977be4bcf8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kn84r" Oct 03 14:35:00 crc kubenswrapper[4636]: I1003 14:35:00.044312 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca12d2cd-3187-4910-9e28-2f977be4bcf8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kn84r\" (UID: \"ca12d2cd-3187-4910-9e28-2f977be4bcf8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kn84r" Oct 03 14:35:00 crc kubenswrapper[4636]: I1003 14:35:00.050953 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ca12d2cd-3187-4910-9e28-2f977be4bcf8-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kn84r\" (UID: \"ca12d2cd-3187-4910-9e28-2f977be4bcf8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kn84r" Oct 03 14:35:00 crc kubenswrapper[4636]: I1003 14:35:00.075846 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg5mn\" (UniqueName: \"kubernetes.io/projected/ca12d2cd-3187-4910-9e28-2f977be4bcf8-kube-api-access-pg5mn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kn84r\" (UID: \"ca12d2cd-3187-4910-9e28-2f977be4bcf8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kn84r" Oct 03 14:35:00 crc kubenswrapper[4636]: I1003 14:35:00.265766 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kn84r" Oct 03 14:35:00 crc kubenswrapper[4636]: I1003 14:35:00.312542 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nx8cj"] Oct 03 14:35:00 crc kubenswrapper[4636]: I1003 14:35:00.719151 4636 generic.go:334] "Generic (PLEG): container finished" podID="83939a20-33aa-47fd-b4db-1c9e917cd74f" containerID="689f721cde4bd00fb1042aae245992696be95c4d343b8759c561b9d7e9131709" exitCode=0 Oct 03 14:35:00 crc kubenswrapper[4636]: I1003 14:35:00.719231 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nx8cj" event={"ID":"83939a20-33aa-47fd-b4db-1c9e917cd74f","Type":"ContainerDied","Data":"689f721cde4bd00fb1042aae245992696be95c4d343b8759c561b9d7e9131709"} Oct 03 14:35:00 crc kubenswrapper[4636]: I1003 14:35:00.719726 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nx8cj" event={"ID":"83939a20-33aa-47fd-b4db-1c9e917cd74f","Type":"ContainerStarted","Data":"17c40af95f61cb5aa6df1c7efa2bba94c6fab263b237bb6cfa76231e7d7ae170"} Oct 03 14:35:00 crc kubenswrapper[4636]: I1003 14:35:00.773693 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kn84r"] Oct 03 14:35:00 crc kubenswrapper[4636]: W1003 14:35:00.776970 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca12d2cd_3187_4910_9e28_2f977be4bcf8.slice/crio-f1218ed2fbc75b13587bcdff36f3da2547b5ffa42ca2f2e56749828a79037b9d WatchSource:0}: Error finding container f1218ed2fbc75b13587bcdff36f3da2547b5ffa42ca2f2e56749828a79037b9d: Status 404 returned error can't find the 
container with id f1218ed2fbc75b13587bcdff36f3da2547b5ffa42ca2f2e56749828a79037b9d Oct 03 14:35:01 crc kubenswrapper[4636]: I1003 14:35:01.741129 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kn84r" event={"ID":"ca12d2cd-3187-4910-9e28-2f977be4bcf8","Type":"ContainerStarted","Data":"8a64fc8794e0bb84d8241de604f7458a9939307bbe9f651b8eea00e511244c63"} Oct 03 14:35:01 crc kubenswrapper[4636]: I1003 14:35:01.741480 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kn84r" event={"ID":"ca12d2cd-3187-4910-9e28-2f977be4bcf8","Type":"ContainerStarted","Data":"f1218ed2fbc75b13587bcdff36f3da2547b5ffa42ca2f2e56749828a79037b9d"} Oct 03 14:35:01 crc kubenswrapper[4636]: I1003 14:35:01.768188 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kn84r" podStartSLOduration=2.049019095 podStartE2EDuration="2.768156786s" podCreationTimestamp="2025-10-03 14:34:59 +0000 UTC" firstStartedPulling="2025-10-03 14:35:00.779816358 +0000 UTC m=+2050.638542605" lastFinishedPulling="2025-10-03 14:35:01.498954049 +0000 UTC m=+2051.357680296" observedRunningTime="2025-10-03 14:35:01.758674201 +0000 UTC m=+2051.617400448" watchObservedRunningTime="2025-10-03 14:35:01.768156786 +0000 UTC m=+2051.626883053" Oct 03 14:35:02 crc kubenswrapper[4636]: I1003 14:35:02.763080 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nx8cj" event={"ID":"83939a20-33aa-47fd-b4db-1c9e917cd74f","Type":"ContainerStarted","Data":"8f834bd59d82bf8384f8ae4c33de3d808cd64899d7f72d07cd8602622b33dcb6"} Oct 03 14:35:04 crc kubenswrapper[4636]: I1003 14:35:04.779810 4636 generic.go:334] "Generic (PLEG): container finished" podID="83939a20-33aa-47fd-b4db-1c9e917cd74f" containerID="8f834bd59d82bf8384f8ae4c33de3d808cd64899d7f72d07cd8602622b33dcb6" exitCode=0 Oct 03 14:35:04 crc kubenswrapper[4636]: I1003 14:35:04.779886 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nx8cj" event={"ID":"83939a20-33aa-47fd-b4db-1c9e917cd74f","Type":"ContainerDied","Data":"8f834bd59d82bf8384f8ae4c33de3d808cd64899d7f72d07cd8602622b33dcb6"} Oct 03 14:35:05 crc kubenswrapper[4636]: I1003 14:35:05.790307 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nx8cj" event={"ID":"83939a20-33aa-47fd-b4db-1c9e917cd74f","Type":"ContainerStarted","Data":"87cf3e6c1c4fa10cd11f857e9e13efa3994a90a088327413a7f51d22957f534b"} Oct 03 14:35:05 crc kubenswrapper[4636]: I1003 14:35:05.810438 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nx8cj" podStartSLOduration=2.369831702 podStartE2EDuration="6.810416016s" podCreationTimestamp="2025-10-03 14:34:59 +0000 UTC" firstStartedPulling="2025-10-03 14:35:00.723576459 +0000 UTC m=+2050.582302706" lastFinishedPulling="2025-10-03 14:35:05.164160773 +0000 UTC m=+2055.022887020" observedRunningTime="2025-10-03 14:35:05.806504835 +0000 UTC m=+2055.665231092" watchObservedRunningTime="2025-10-03 14:35:05.810416016 +0000 UTC m=+2055.669142263" Oct 03 14:35:05 crc kubenswrapper[4636]: I1003 14:35:05.898212 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x57lh" Oct 03 14:35:05 crc kubenswrapper[4636]: I1003 14:35:05.898270 4636 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x57lh" Oct 03 14:35:05 crc kubenswrapper[4636]: I1003 14:35:05.946736 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x57lh" Oct 03 14:35:06 crc kubenswrapper[4636]: I1003 14:35:06.845840 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x57lh" Oct 03 14:35:07 crc kubenswrapper[4636]: I1003 14:35:07.560658 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x57lh"] Oct 03 14:35:08 crc kubenswrapper[4636]: I1003 14:35:08.814935 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x57lh" podUID="64306433-d60a-4f75-9376-41a83875c1c3" containerName="registry-server" containerID="cri-o://bbea1c1e07553c963e47f95781616d2d34860639466b23f8ab1639d11398c054" gracePeriod=2 Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.163059 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.163348 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.163400 4636 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.163924 4636 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6bf347f17c6a57808711c5d59e3068eaceb6db558930f7501a3d5cd85b4d3b8e"} pod="openshift-machine-config-operator/machine-config-daemon-ngmch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.163987 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" containerID="cri-o://6bf347f17c6a57808711c5d59e3068eaceb6db558930f7501a3d5cd85b4d3b8e" gracePeriod=600 Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.253153 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x57lh" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.303773 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64306433-d60a-4f75-9376-41a83875c1c3-utilities\") pod \"64306433-d60a-4f75-9376-41a83875c1c3\" (UID: \"64306433-d60a-4f75-9376-41a83875c1c3\") " Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.303904 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwhfb\" (UniqueName: \"kubernetes.io/projected/64306433-d60a-4f75-9376-41a83875c1c3-kube-api-access-bwhfb\") pod \"64306433-d60a-4f75-9376-41a83875c1c3\" (UID: \"64306433-d60a-4f75-9376-41a83875c1c3\") " Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.304029 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64306433-d60a-4f75-9376-41a83875c1c3-catalog-content\") pod \"64306433-d60a-4f75-9376-41a83875c1c3\" (UID: \"64306433-d60a-4f75-9376-41a83875c1c3\") " Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.304723 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64306433-d60a-4f75-9376-41a83875c1c3-utilities" (OuterVolumeSpecName: "utilities") pod "64306433-d60a-4f75-9376-41a83875c1c3" (UID: "64306433-d60a-4f75-9376-41a83875c1c3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.310066 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64306433-d60a-4f75-9376-41a83875c1c3-kube-api-access-bwhfb" (OuterVolumeSpecName: "kube-api-access-bwhfb") pod "64306433-d60a-4f75-9376-41a83875c1c3" (UID: "64306433-d60a-4f75-9376-41a83875c1c3"). InnerVolumeSpecName "kube-api-access-bwhfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.316598 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64306433-d60a-4f75-9376-41a83875c1c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64306433-d60a-4f75-9376-41a83875c1c3" (UID: "64306433-d60a-4f75-9376-41a83875c1c3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.405988 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64306433-d60a-4f75-9376-41a83875c1c3-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.406280 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwhfb\" (UniqueName: \"kubernetes.io/projected/64306433-d60a-4f75-9376-41a83875c1c3-kube-api-access-bwhfb\") on node \"crc\" DevicePath \"\"" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.406296 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64306433-d60a-4f75-9376-41a83875c1c3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.826290 4636 generic.go:334] "Generic (PLEG): container finished" podID="64306433-d60a-4f75-9376-41a83875c1c3" containerID="bbea1c1e07553c963e47f95781616d2d34860639466b23f8ab1639d11398c054" exitCode=0 Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.826366 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x57lh" event={"ID":"64306433-d60a-4f75-9376-41a83875c1c3","Type":"ContainerDied","Data":"bbea1c1e07553c963e47f95781616d2d34860639466b23f8ab1639d11398c054"} Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.826407 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x57lh" event={"ID":"64306433-d60a-4f75-9376-41a83875c1c3","Type":"ContainerDied","Data":"7200321761bdf065df6117136484624fa4f619ec473afd13019a8b0097b58e19"} Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.826429 4636 scope.go:117] "RemoveContainer" containerID="bbea1c1e07553c963e47f95781616d2d34860639466b23f8ab1639d11398c054" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.828748 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x57lh" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.829718 4636 generic.go:334] "Generic (PLEG): container finished" podID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerID="6bf347f17c6a57808711c5d59e3068eaceb6db558930f7501a3d5cd85b4d3b8e" exitCode=0 Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.829753 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerDied","Data":"6bf347f17c6a57808711c5d59e3068eaceb6db558930f7501a3d5cd85b4d3b8e"} Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.829772 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerStarted","Data":"021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab"} Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.847135 4636 scope.go:117] "RemoveContainer" containerID="09c62bd1ad3ce16d2474cfba4b40224b1bba481d734d9dcf978203065b74985f" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.896273 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x57lh"] Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.902960 4636 scope.go:117] "RemoveContainer" containerID="10cd76f6f2dfb186a58561344b209a034e6bf8c1203fc9e09456d66f22e550aa" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.905217 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x57lh"] Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.918510 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nx8cj" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.918804 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nx8cj" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.932776 4636 scope.go:117] "RemoveContainer" containerID="bbea1c1e07553c963e47f95781616d2d34860639466b23f8ab1639d11398c054" Oct 03 14:35:09 crc kubenswrapper[4636]: E1003 14:35:09.933678 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbea1c1e07553c963e47f95781616d2d34860639466b23f8ab1639d11398c054\": container with ID starting with bbea1c1e07553c963e47f95781616d2d34860639466b23f8ab1639d11398c054 not found: ID does not exist" containerID="bbea1c1e07553c963e47f95781616d2d34860639466b23f8ab1639d11398c054" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.933903 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbea1c1e07553c963e47f95781616d2d34860639466b23f8ab1639d11398c054"} err="failed to get container status \"bbea1c1e07553c963e47f95781616d2d34860639466b23f8ab1639d11398c054\": rpc error: code = NotFound desc = could not find container \"bbea1c1e07553c963e47f95781616d2d34860639466b23f8ab1639d11398c054\": container with ID starting with bbea1c1e07553c963e47f95781616d2d34860639466b23f8ab1639d11398c054 not found: ID does not exist" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.934001 4636 scope.go:117] "RemoveContainer" containerID="09c62bd1ad3ce16d2474cfba4b40224b1bba481d734d9dcf978203065b74985f" Oct 03 14:35:09 crc kubenswrapper[4636]: E1003 
14:35:09.934376 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09c62bd1ad3ce16d2474cfba4b40224b1bba481d734d9dcf978203065b74985f\": container with ID starting with 09c62bd1ad3ce16d2474cfba4b40224b1bba481d734d9dcf978203065b74985f not found: ID does not exist" containerID="09c62bd1ad3ce16d2474cfba4b40224b1bba481d734d9dcf978203065b74985f" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.934493 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c62bd1ad3ce16d2474cfba4b40224b1bba481d734d9dcf978203065b74985f"} err="failed to get container status \"09c62bd1ad3ce16d2474cfba4b40224b1bba481d734d9dcf978203065b74985f\": rpc error: code = NotFound desc = could not find container \"09c62bd1ad3ce16d2474cfba4b40224b1bba481d734d9dcf978203065b74985f\": container with ID starting with 09c62bd1ad3ce16d2474cfba4b40224b1bba481d734d9dcf978203065b74985f not found: ID does not exist" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.934576 4636 scope.go:117] "RemoveContainer" containerID="10cd76f6f2dfb186a58561344b209a034e6bf8c1203fc9e09456d66f22e550aa" Oct 03 14:35:09 crc kubenswrapper[4636]: E1003 14:35:09.934934 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10cd76f6f2dfb186a58561344b209a034e6bf8c1203fc9e09456d66f22e550aa\": container with ID starting with 10cd76f6f2dfb186a58561344b209a034e6bf8c1203fc9e09456d66f22e550aa not found: ID does not exist" containerID="10cd76f6f2dfb186a58561344b209a034e6bf8c1203fc9e09456d66f22e550aa" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.934977 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10cd76f6f2dfb186a58561344b209a034e6bf8c1203fc9e09456d66f22e550aa"} err="failed to get container status \"10cd76f6f2dfb186a58561344b209a034e6bf8c1203fc9e09456d66f22e550aa\": rpc error: code = NotFound desc = could not find container \"10cd76f6f2dfb186a58561344b209a034e6bf8c1203fc9e09456d66f22e550aa\": container with ID starting with 10cd76f6f2dfb186a58561344b209a034e6bf8c1203fc9e09456d66f22e550aa not found: ID does not exist" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.935013 4636 scope.go:117] "RemoveContainer" containerID="4fb7759862634a27bd9e0c4450a56e8a8fe07db1c517fd1a31725161f9a78186" Oct 03 14:35:09 crc kubenswrapper[4636]: I1003 14:35:09.978769 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nx8cj" Oct 03 14:35:10 crc kubenswrapper[4636]: I1003 14:35:10.804340 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64306433-d60a-4f75-9376-41a83875c1c3" path="/var/lib/kubelet/pods/64306433-d60a-4f75-9376-41a83875c1c3/volumes" Oct 03 14:35:10 crc kubenswrapper[4636]: I1003 14:35:10.893488 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nx8cj" Oct 03 14:35:12 crc kubenswrapper[4636]: I1003 14:35:12.557064 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nx8cj"] Oct 03 14:35:13 crc kubenswrapper[4636]: I1003 14:35:13.863550 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nx8cj" podUID="83939a20-33aa-47fd-b4db-1c9e917cd74f" containerName="registry-server" 
containerID="cri-o://87cf3e6c1c4fa10cd11f857e9e13efa3994a90a088327413a7f51d22957f534b" gracePeriod=2 Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.348689 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nx8cj" Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.408156 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83939a20-33aa-47fd-b4db-1c9e917cd74f-utilities\") pod \"83939a20-33aa-47fd-b4db-1c9e917cd74f\" (UID: \"83939a20-33aa-47fd-b4db-1c9e917cd74f\") " Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.408478 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlwpt\" (UniqueName: \"kubernetes.io/projected/83939a20-33aa-47fd-b4db-1c9e917cd74f-kube-api-access-mlwpt\") pod \"83939a20-33aa-47fd-b4db-1c9e917cd74f\" (UID: \"83939a20-33aa-47fd-b4db-1c9e917cd74f\") " Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.408706 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83939a20-33aa-47fd-b4db-1c9e917cd74f-catalog-content\") pod \"83939a20-33aa-47fd-b4db-1c9e917cd74f\" (UID: \"83939a20-33aa-47fd-b4db-1c9e917cd74f\") " Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.409275 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83939a20-33aa-47fd-b4db-1c9e917cd74f-utilities" (OuterVolumeSpecName: "utilities") pod "83939a20-33aa-47fd-b4db-1c9e917cd74f" (UID: "83939a20-33aa-47fd-b4db-1c9e917cd74f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.409732 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83939a20-33aa-47fd-b4db-1c9e917cd74f-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.420083 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83939a20-33aa-47fd-b4db-1c9e917cd74f-kube-api-access-mlwpt" (OuterVolumeSpecName: "kube-api-access-mlwpt") pod "83939a20-33aa-47fd-b4db-1c9e917cd74f" (UID: "83939a20-33aa-47fd-b4db-1c9e917cd74f"). InnerVolumeSpecName "kube-api-access-mlwpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.458427 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83939a20-33aa-47fd-b4db-1c9e917cd74f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83939a20-33aa-47fd-b4db-1c9e917cd74f" (UID: "83939a20-33aa-47fd-b4db-1c9e917cd74f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.511615 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlwpt\" (UniqueName: \"kubernetes.io/projected/83939a20-33aa-47fd-b4db-1c9e917cd74f-kube-api-access-mlwpt\") on node \"crc\" DevicePath \"\"" Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.511651 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83939a20-33aa-47fd-b4db-1c9e917cd74f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.873717 4636 generic.go:334] "Generic (PLEG): container finished" podID="83939a20-33aa-47fd-b4db-1c9e917cd74f" containerID="87cf3e6c1c4fa10cd11f857e9e13efa3994a90a088327413a7f51d22957f534b" exitCode=0 Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.873778 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nx8cj" Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.873795 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nx8cj" event={"ID":"83939a20-33aa-47fd-b4db-1c9e917cd74f","Type":"ContainerDied","Data":"87cf3e6c1c4fa10cd11f857e9e13efa3994a90a088327413a7f51d22957f534b"} Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.874147 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nx8cj" event={"ID":"83939a20-33aa-47fd-b4db-1c9e917cd74f","Type":"ContainerDied","Data":"17c40af95f61cb5aa6df1c7efa2bba94c6fab263b237bb6cfa76231e7d7ae170"} Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.874175 4636 scope.go:117] "RemoveContainer" containerID="87cf3e6c1c4fa10cd11f857e9e13efa3994a90a088327413a7f51d22957f534b" Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.901018 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nx8cj"] Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.901952 4636 scope.go:117] "RemoveContainer" containerID="8f834bd59d82bf8384f8ae4c33de3d808cd64899d7f72d07cd8602622b33dcb6" Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.909773 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nx8cj"] Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.930833 4636 scope.go:117] "RemoveContainer" containerID="689f721cde4bd00fb1042aae245992696be95c4d343b8759c561b9d7e9131709" Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.959725 4636 scope.go:117] "RemoveContainer" containerID="87cf3e6c1c4fa10cd11f857e9e13efa3994a90a088327413a7f51d22957f534b" Oct 03 14:35:14 crc kubenswrapper[4636]: E1003 14:35:14.960165 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87cf3e6c1c4fa10cd11f857e9e13efa3994a90a088327413a7f51d22957f534b\": container with ID starting with 87cf3e6c1c4fa10cd11f857e9e13efa3994a90a088327413a7f51d22957f534b not found: ID does not exist" containerID="87cf3e6c1c4fa10cd11f857e9e13efa3994a90a088327413a7f51d22957f534b" Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.960204 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87cf3e6c1c4fa10cd11f857e9e13efa3994a90a088327413a7f51d22957f534b"} err="failed to get container status 
\"87cf3e6c1c4fa10cd11f857e9e13efa3994a90a088327413a7f51d22957f534b\": rpc error: code = NotFound desc = could not find container \"87cf3e6c1c4fa10cd11f857e9e13efa3994a90a088327413a7f51d22957f534b\": container with ID starting with 87cf3e6c1c4fa10cd11f857e9e13efa3994a90a088327413a7f51d22957f534b not found: ID does not exist" Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.960233 4636 scope.go:117] "RemoveContainer" containerID="8f834bd59d82bf8384f8ae4c33de3d808cd64899d7f72d07cd8602622b33dcb6" Oct 03 14:35:14 crc kubenswrapper[4636]: E1003 14:35:14.960572 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f834bd59d82bf8384f8ae4c33de3d808cd64899d7f72d07cd8602622b33dcb6\": container with ID starting with 8f834bd59d82bf8384f8ae4c33de3d808cd64899d7f72d07cd8602622b33dcb6 not found: ID does not exist" containerID="8f834bd59d82bf8384f8ae4c33de3d808cd64899d7f72d07cd8602622b33dcb6" Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.960592 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f834bd59d82bf8384f8ae4c33de3d808cd64899d7f72d07cd8602622b33dcb6"} err="failed to get container status \"8f834bd59d82bf8384f8ae4c33de3d808cd64899d7f72d07cd8602622b33dcb6\": rpc error: code = NotFound desc = could not find container \"8f834bd59d82bf8384f8ae4c33de3d808cd64899d7f72d07cd8602622b33dcb6\": container with ID starting with 8f834bd59d82bf8384f8ae4c33de3d808cd64899d7f72d07cd8602622b33dcb6 not found: ID does not exist" Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.960609 4636 scope.go:117] "RemoveContainer" containerID="689f721cde4bd00fb1042aae245992696be95c4d343b8759c561b9d7e9131709" Oct 03 14:35:14 crc kubenswrapper[4636]: E1003 14:35:14.960983 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"689f721cde4bd00fb1042aae245992696be95c4d343b8759c561b9d7e9131709\": container with ID starting with 689f721cde4bd00fb1042aae245992696be95c4d343b8759c561b9d7e9131709 not found: ID does not exist" containerID="689f721cde4bd00fb1042aae245992696be95c4d343b8759c561b9d7e9131709" Oct 03 14:35:14 crc kubenswrapper[4636]: I1003 14:35:14.961001 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"689f721cde4bd00fb1042aae245992696be95c4d343b8759c561b9d7e9131709"} err="failed to get container status \"689f721cde4bd00fb1042aae245992696be95c4d343b8759c561b9d7e9131709\": rpc error: code = NotFound desc = could not find container \"689f721cde4bd00fb1042aae245992696be95c4d343b8759c561b9d7e9131709\": container with ID starting with 689f721cde4bd00fb1042aae245992696be95c4d343b8759c561b9d7e9131709 not found: ID does not exist" Oct 03 14:35:16 crc kubenswrapper[4636]: I1003 14:35:16.803464 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83939a20-33aa-47fd-b4db-1c9e917cd74f" path="/var/lib/kubelet/pods/83939a20-33aa-47fd-b4db-1c9e917cd74f/volumes" Oct 03 14:35:58 crc kubenswrapper[4636]: I1003 14:35:58.260546 4636 generic.go:334] "Generic (PLEG): container finished" podID="ca12d2cd-3187-4910-9e28-2f977be4bcf8" containerID="8a64fc8794e0bb84d8241de604f7458a9939307bbe9f651b8eea00e511244c63" exitCode=2 Oct 03 14:35:58 crc kubenswrapper[4636]: I1003 14:35:58.260653 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kn84r" 
event={"ID":"ca12d2cd-3187-4910-9e28-2f977be4bcf8","Type":"ContainerDied","Data":"8a64fc8794e0bb84d8241de604f7458a9939307bbe9f651b8eea00e511244c63"} Oct 03 14:35:59 crc kubenswrapper[4636]: I1003 14:35:59.706174 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kn84r" Oct 03 14:35:59 crc kubenswrapper[4636]: I1003 14:35:59.829820 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ca12d2cd-3187-4910-9e28-2f977be4bcf8-ssh-key\") pod \"ca12d2cd-3187-4910-9e28-2f977be4bcf8\" (UID: \"ca12d2cd-3187-4910-9e28-2f977be4bcf8\") " Oct 03 14:35:59 crc kubenswrapper[4636]: I1003 14:35:59.829961 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca12d2cd-3187-4910-9e28-2f977be4bcf8-inventory\") pod \"ca12d2cd-3187-4910-9e28-2f977be4bcf8\" (UID: \"ca12d2cd-3187-4910-9e28-2f977be4bcf8\") " Oct 03 14:35:59 crc kubenswrapper[4636]: I1003 14:35:59.830023 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pg5mn\" (UniqueName: \"kubernetes.io/projected/ca12d2cd-3187-4910-9e28-2f977be4bcf8-kube-api-access-pg5mn\") pod \"ca12d2cd-3187-4910-9e28-2f977be4bcf8\" (UID: \"ca12d2cd-3187-4910-9e28-2f977be4bcf8\") " Oct 03 14:35:59 crc kubenswrapper[4636]: I1003 14:35:59.837415 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca12d2cd-3187-4910-9e28-2f977be4bcf8-kube-api-access-pg5mn" (OuterVolumeSpecName: "kube-api-access-pg5mn") pod "ca12d2cd-3187-4910-9e28-2f977be4bcf8" (UID: "ca12d2cd-3187-4910-9e28-2f977be4bcf8"). InnerVolumeSpecName "kube-api-access-pg5mn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:35:59 crc kubenswrapper[4636]: I1003 14:35:59.859024 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca12d2cd-3187-4910-9e28-2f977be4bcf8-inventory" (OuterVolumeSpecName: "inventory") pod "ca12d2cd-3187-4910-9e28-2f977be4bcf8" (UID: "ca12d2cd-3187-4910-9e28-2f977be4bcf8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:35:59 crc kubenswrapper[4636]: I1003 14:35:59.864872 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca12d2cd-3187-4910-9e28-2f977be4bcf8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ca12d2cd-3187-4910-9e28-2f977be4bcf8" (UID: "ca12d2cd-3187-4910-9e28-2f977be4bcf8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:35:59 crc kubenswrapper[4636]: I1003 14:35:59.935791 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pg5mn\" (UniqueName: \"kubernetes.io/projected/ca12d2cd-3187-4910-9e28-2f977be4bcf8-kube-api-access-pg5mn\") on node \"crc\" DevicePath \"\"" Oct 03 14:35:59 crc kubenswrapper[4636]: I1003 14:35:59.935832 4636 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ca12d2cd-3187-4910-9e28-2f977be4bcf8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:35:59 crc kubenswrapper[4636]: I1003 14:35:59.935848 4636 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca12d2cd-3187-4910-9e28-2f977be4bcf8-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 14:36:00 crc kubenswrapper[4636]: I1003 14:36:00.300623 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kn84r" event={"ID":"ca12d2cd-3187-4910-9e28-2f977be4bcf8","Type":"ContainerDied","Data":"f1218ed2fbc75b13587bcdff36f3da2547b5ffa42ca2f2e56749828a79037b9d"} Oct 03 14:36:00 crc kubenswrapper[4636]: I1003 14:36:00.300689 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1218ed2fbc75b13587bcdff36f3da2547b5ffa42ca2f2e56749828a79037b9d" Oct 03 14:36:00 crc kubenswrapper[4636]: I1003 14:36:00.300829 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kn84r" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.037710 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk"] Oct 03 14:36:08 crc kubenswrapper[4636]: E1003 14:36:08.038761 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64306433-d60a-4f75-9376-41a83875c1c3" containerName="registry-server" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.038779 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="64306433-d60a-4f75-9376-41a83875c1c3" containerName="registry-server" Oct 03 14:36:08 crc kubenswrapper[4636]: E1003 14:36:08.038803 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64306433-d60a-4f75-9376-41a83875c1c3" containerName="extract-utilities" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.038811 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="64306433-d60a-4f75-9376-41a83875c1c3" containerName="extract-utilities" Oct 03 14:36:08 crc kubenswrapper[4636]: E1003 14:36:08.038824 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83939a20-33aa-47fd-b4db-1c9e917cd74f" containerName="extract-utilities" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.038831 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="83939a20-33aa-47fd-b4db-1c9e917cd74f" containerName="extract-utilities" Oct 03 14:36:08 crc kubenswrapper[4636]: E1003 14:36:08.038847 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64306433-d60a-4f75-9376-41a83875c1c3" containerName="extract-content" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.038854 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="64306433-d60a-4f75-9376-41a83875c1c3" containerName="extract-content" Oct 03 14:36:08 crc kubenswrapper[4636]: E1003 14:36:08.038873 4636 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="83939a20-33aa-47fd-b4db-1c9e917cd74f" containerName="registry-server" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.038881 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="83939a20-33aa-47fd-b4db-1c9e917cd74f" containerName="registry-server" Oct 03 14:36:08 crc kubenswrapper[4636]: E1003 14:36:08.038897 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83939a20-33aa-47fd-b4db-1c9e917cd74f" containerName="extract-content" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.038906 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="83939a20-33aa-47fd-b4db-1c9e917cd74f" containerName="extract-content" Oct 03 14:36:08 crc kubenswrapper[4636]: E1003 14:36:08.038930 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca12d2cd-3187-4910-9e28-2f977be4bcf8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.038939 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca12d2cd-3187-4910-9e28-2f977be4bcf8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.039186 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="64306433-d60a-4f75-9376-41a83875c1c3" containerName="registry-server" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.039208 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca12d2cd-3187-4910-9e28-2f977be4bcf8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.039221 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="83939a20-33aa-47fd-b4db-1c9e917cd74f" containerName="registry-server" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.040030 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.043914 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.044245 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.044430 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8b7c8" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.044543 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk"] Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.046045 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.218034 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e872c241-3445-4382-a7f0-1a15d6a223c2-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk\" (UID: \"e872c241-3445-4382-a7f0-1a15d6a223c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.218127 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97gl5\" (UniqueName: \"kubernetes.io/projected/e872c241-3445-4382-a7f0-1a15d6a223c2-kube-api-access-97gl5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk\" (UID: \"e872c241-3445-4382-a7f0-1a15d6a223c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.218173 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e872c241-3445-4382-a7f0-1a15d6a223c2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk\" (UID: \"e872c241-3445-4382-a7f0-1a15d6a223c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.320964 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e872c241-3445-4382-a7f0-1a15d6a223c2-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk\" (UID: \"e872c241-3445-4382-a7f0-1a15d6a223c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.321067 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97gl5\" (UniqueName: \"kubernetes.io/projected/e872c241-3445-4382-a7f0-1a15d6a223c2-kube-api-access-97gl5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk\" (UID: \"e872c241-3445-4382-a7f0-1a15d6a223c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.321232 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e872c241-3445-4382-a7f0-1a15d6a223c2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk\" 
(UID: \"e872c241-3445-4382-a7f0-1a15d6a223c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.327238 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e872c241-3445-4382-a7f0-1a15d6a223c2-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk\" (UID: \"e872c241-3445-4382-a7f0-1a15d6a223c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.328428 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e872c241-3445-4382-a7f0-1a15d6a223c2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk\" (UID: \"e872c241-3445-4382-a7f0-1a15d6a223c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.345976 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97gl5\" (UniqueName: \"kubernetes.io/projected/e872c241-3445-4382-a7f0-1a15d6a223c2-kube-api-access-97gl5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk\" (UID: \"e872c241-3445-4382-a7f0-1a15d6a223c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.368497 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk" Oct 03 14:36:08 crc kubenswrapper[4636]: I1003 14:36:08.973943 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk"] Oct 03 14:36:08 crc kubenswrapper[4636]: W1003 14:36:08.980542 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode872c241_3445_4382_a7f0_1a15d6a223c2.slice/crio-5ad516c75630cb12d7ed681889e305cd7fe863fc205410e3be305490784e692c WatchSource:0}: Error finding container 5ad516c75630cb12d7ed681889e305cd7fe863fc205410e3be305490784e692c: Status 404 returned error can't find the container with id 5ad516c75630cb12d7ed681889e305cd7fe863fc205410e3be305490784e692c Oct 03 14:36:09 crc kubenswrapper[4636]: I1003 14:36:09.384956 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk" event={"ID":"e872c241-3445-4382-a7f0-1a15d6a223c2","Type":"ContainerStarted","Data":"5ad516c75630cb12d7ed681889e305cd7fe863fc205410e3be305490784e692c"} Oct 03 14:36:10 crc kubenswrapper[4636]: I1003 14:36:10.394356 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk" event={"ID":"e872c241-3445-4382-a7f0-1a15d6a223c2","Type":"ContainerStarted","Data":"08e366d93e6134becbd0d93f3806827cca24dc900c27d7f4af0b994c67ebe740"} Oct 03 14:36:10 crc kubenswrapper[4636]: I1003 14:36:10.420885 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk" podStartSLOduration=1.8638678020000001 podStartE2EDuration="2.420864275s" podCreationTimestamp="2025-10-03 14:36:08 +0000 UTC" firstStartedPulling="2025-10-03 14:36:08.986626257 +0000 UTC m=+2118.845352504" lastFinishedPulling="2025-10-03 14:36:09.54362273 +0000 UTC m=+2119.402348977" observedRunningTime="2025-10-03 
Oct 03 14:36:56 crc kubenswrapper[4636]: I1003 14:36:56.762406 4636 generic.go:334] "Generic (PLEG): container finished" podID="e872c241-3445-4382-a7f0-1a15d6a223c2" containerID="08e366d93e6134becbd0d93f3806827cca24dc900c27d7f4af0b994c67ebe740" exitCode=0
Oct 03 14:36:56 crc kubenswrapper[4636]: I1003 14:36:56.762496 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk" event={"ID":"e872c241-3445-4382-a7f0-1a15d6a223c2","Type":"ContainerDied","Data":"08e366d93e6134becbd0d93f3806827cca24dc900c27d7f4af0b994c67ebe740"}
Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.136528 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk"
Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.247505 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e872c241-3445-4382-a7f0-1a15d6a223c2-ssh-key\") pod \"e872c241-3445-4382-a7f0-1a15d6a223c2\" (UID: \"e872c241-3445-4382-a7f0-1a15d6a223c2\") "
Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.247641 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e872c241-3445-4382-a7f0-1a15d6a223c2-inventory\") pod \"e872c241-3445-4382-a7f0-1a15d6a223c2\" (UID: \"e872c241-3445-4382-a7f0-1a15d6a223c2\") "
Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.247677 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97gl5\" (UniqueName: \"kubernetes.io/projected/e872c241-3445-4382-a7f0-1a15d6a223c2-kube-api-access-97gl5\") pod \"e872c241-3445-4382-a7f0-1a15d6a223c2\" (UID: \"e872c241-3445-4382-a7f0-1a15d6a223c2\") "
Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.253044 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e872c241-3445-4382-a7f0-1a15d6a223c2-kube-api-access-97gl5" (OuterVolumeSpecName: "kube-api-access-97gl5") pod "e872c241-3445-4382-a7f0-1a15d6a223c2" (UID: "e872c241-3445-4382-a7f0-1a15d6a223c2"). InnerVolumeSpecName "kube-api-access-97gl5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.275054 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e872c241-3445-4382-a7f0-1a15d6a223c2-inventory" (OuterVolumeSpecName: "inventory") pod "e872c241-3445-4382-a7f0-1a15d6a223c2" (UID: "e872c241-3445-4382-a7f0-1a15d6a223c2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.278214 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e872c241-3445-4382-a7f0-1a15d6a223c2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e872c241-3445-4382-a7f0-1a15d6a223c2" (UID: "e872c241-3445-4382-a7f0-1a15d6a223c2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.350122 4636 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e872c241-3445-4382-a7f0-1a15d6a223c2-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.350152 4636 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e872c241-3445-4382-a7f0-1a15d6a223c2-inventory\") on node \"crc\" DevicePath \"\""
Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.350162 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97gl5\" (UniqueName: \"kubernetes.io/projected/e872c241-3445-4382-a7f0-1a15d6a223c2-kube-api-access-97gl5\") on node \"crc\" DevicePath \"\""
Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.779544 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk" event={"ID":"e872c241-3445-4382-a7f0-1a15d6a223c2","Type":"ContainerDied","Data":"5ad516c75630cb12d7ed681889e305cd7fe863fc205410e3be305490784e692c"}
Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.779874 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ad516c75630cb12d7ed681889e305cd7fe863fc205410e3be305490784e692c"
Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.779599 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk"
Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.895262 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-f24zs"]
Oct 03 14:36:58 crc kubenswrapper[4636]: E1003 14:36:58.895646 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e872c241-3445-4382-a7f0-1a15d6a223c2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.895661 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="e872c241-3445-4382-a7f0-1a15d6a223c2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.895876 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="e872c241-3445-4382-a7f0-1a15d6a223c2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.897461 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-f24zs"
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-f24zs" Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.903591 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-f24zs"] Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.905462 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.906034 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.906143 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.906236 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8b7c8" Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.968613 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4fa2c95f-4798-46d0-8e21-31334d585714-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-f24zs\" (UID: \"4fa2c95f-4798-46d0-8e21-31334d585714\") " pod="openstack/ssh-known-hosts-edpm-deployment-f24zs" Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.968671 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kxb6\" (UniqueName: \"kubernetes.io/projected/4fa2c95f-4798-46d0-8e21-31334d585714-kube-api-access-5kxb6\") pod \"ssh-known-hosts-edpm-deployment-f24zs\" (UID: \"4fa2c95f-4798-46d0-8e21-31334d585714\") " pod="openstack/ssh-known-hosts-edpm-deployment-f24zs" Oct 03 14:36:58 crc kubenswrapper[4636]: I1003 14:36:58.968742 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4fa2c95f-4798-46d0-8e21-31334d585714-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-f24zs\" (UID: \"4fa2c95f-4798-46d0-8e21-31334d585714\") " pod="openstack/ssh-known-hosts-edpm-deployment-f24zs" Oct 03 14:36:59 crc kubenswrapper[4636]: I1003 14:36:59.070881 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4fa2c95f-4798-46d0-8e21-31334d585714-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-f24zs\" (UID: \"4fa2c95f-4798-46d0-8e21-31334d585714\") " pod="openstack/ssh-known-hosts-edpm-deployment-f24zs" Oct 03 14:36:59 crc kubenswrapper[4636]: I1003 14:36:59.070925 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kxb6\" (UniqueName: \"kubernetes.io/projected/4fa2c95f-4798-46d0-8e21-31334d585714-kube-api-access-5kxb6\") pod \"ssh-known-hosts-edpm-deployment-f24zs\" (UID: \"4fa2c95f-4798-46d0-8e21-31334d585714\") " pod="openstack/ssh-known-hosts-edpm-deployment-f24zs" Oct 03 14:36:59 crc kubenswrapper[4636]: I1003 14:36:59.070980 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4fa2c95f-4798-46d0-8e21-31334d585714-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-f24zs\" (UID: \"4fa2c95f-4798-46d0-8e21-31334d585714\") " pod="openstack/ssh-known-hosts-edpm-deployment-f24zs" Oct 03 14:36:59 crc 
kubenswrapper[4636]: I1003 14:36:59.079786 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4fa2c95f-4798-46d0-8e21-31334d585714-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-f24zs\" (UID: \"4fa2c95f-4798-46d0-8e21-31334d585714\") " pod="openstack/ssh-known-hosts-edpm-deployment-f24zs" Oct 03 14:36:59 crc kubenswrapper[4636]: I1003 14:36:59.085814 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4fa2c95f-4798-46d0-8e21-31334d585714-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-f24zs\" (UID: \"4fa2c95f-4798-46d0-8e21-31334d585714\") " pod="openstack/ssh-known-hosts-edpm-deployment-f24zs" Oct 03 14:36:59 crc kubenswrapper[4636]: I1003 14:36:59.086518 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kxb6\" (UniqueName: \"kubernetes.io/projected/4fa2c95f-4798-46d0-8e21-31334d585714-kube-api-access-5kxb6\") pod \"ssh-known-hosts-edpm-deployment-f24zs\" (UID: \"4fa2c95f-4798-46d0-8e21-31334d585714\") " pod="openstack/ssh-known-hosts-edpm-deployment-f24zs" Oct 03 14:36:59 crc kubenswrapper[4636]: I1003 14:36:59.238125 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-f24zs" Oct 03 14:36:59 crc kubenswrapper[4636]: I1003 14:36:59.757560 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-f24zs"] Oct 03 14:36:59 crc kubenswrapper[4636]: I1003 14:36:59.795858 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-f24zs" event={"ID":"4fa2c95f-4798-46d0-8e21-31334d585714","Type":"ContainerStarted","Data":"7589ced898872730887bf98bc1383c9f481a5c28b0c50410038f87aa7fbc2f70"} Oct 03 14:37:00 crc kubenswrapper[4636]: I1003 14:37:00.807292 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-f24zs" event={"ID":"4fa2c95f-4798-46d0-8e21-31334d585714","Type":"ContainerStarted","Data":"5e16c23a15a52837269d258900d2c834e62e831c8c75bd871717e64450db824e"} Oct 03 14:37:00 crc kubenswrapper[4636]: I1003 14:37:00.844653 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-f24zs" podStartSLOduration=2.383962084 podStartE2EDuration="2.844629063s" podCreationTimestamp="2025-10-03 14:36:58 +0000 UTC" firstStartedPulling="2025-10-03 14:36:59.77168402 +0000 UTC m=+2169.630410267" lastFinishedPulling="2025-10-03 14:37:00.232350999 +0000 UTC m=+2170.091077246" observedRunningTime="2025-10-03 14:37:00.834301524 +0000 UTC m=+2170.693027791" watchObservedRunningTime="2025-10-03 14:37:00.844629063 +0000 UTC m=+2170.703355320" Oct 03 14:37:07 crc kubenswrapper[4636]: I1003 14:37:07.875872 4636 generic.go:334] "Generic (PLEG): container finished" podID="4fa2c95f-4798-46d0-8e21-31334d585714" containerID="5e16c23a15a52837269d258900d2c834e62e831c8c75bd871717e64450db824e" exitCode=0 Oct 03 14:37:07 crc kubenswrapper[4636]: I1003 14:37:07.875986 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-f24zs" event={"ID":"4fa2c95f-4798-46d0-8e21-31334d585714","Type":"ContainerDied","Data":"5e16c23a15a52837269d258900d2c834e62e831c8c75bd871717e64450db824e"} Oct 03 14:37:09 crc kubenswrapper[4636]: I1003 14:37:09.162929 4636 patch_prober.go:28] interesting 
pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:37:09 crc kubenswrapper[4636]: I1003 14:37:09.163245 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:37:09 crc kubenswrapper[4636]: I1003 14:37:09.270587 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-f24zs" Oct 03 14:37:09 crc kubenswrapper[4636]: I1003 14:37:09.364248 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kxb6\" (UniqueName: \"kubernetes.io/projected/4fa2c95f-4798-46d0-8e21-31334d585714-kube-api-access-5kxb6\") pod \"4fa2c95f-4798-46d0-8e21-31334d585714\" (UID: \"4fa2c95f-4798-46d0-8e21-31334d585714\") " Oct 03 14:37:09 crc kubenswrapper[4636]: I1003 14:37:09.364443 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4fa2c95f-4798-46d0-8e21-31334d585714-inventory-0\") pod \"4fa2c95f-4798-46d0-8e21-31334d585714\" (UID: \"4fa2c95f-4798-46d0-8e21-31334d585714\") " Oct 03 14:37:09 crc kubenswrapper[4636]: I1003 14:37:09.364495 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4fa2c95f-4798-46d0-8e21-31334d585714-ssh-key-openstack-edpm-ipam\") pod \"4fa2c95f-4798-46d0-8e21-31334d585714\" (UID: \"4fa2c95f-4798-46d0-8e21-31334d585714\") " Oct 03 14:37:09 crc kubenswrapper[4636]: I1003 14:37:09.381122 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fa2c95f-4798-46d0-8e21-31334d585714-kube-api-access-5kxb6" (OuterVolumeSpecName: "kube-api-access-5kxb6") pod "4fa2c95f-4798-46d0-8e21-31334d585714" (UID: "4fa2c95f-4798-46d0-8e21-31334d585714"). InnerVolumeSpecName "kube-api-access-5kxb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:37:09 crc kubenswrapper[4636]: I1003 14:37:09.399203 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fa2c95f-4798-46d0-8e21-31334d585714-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "4fa2c95f-4798-46d0-8e21-31334d585714" (UID: "4fa2c95f-4798-46d0-8e21-31334d585714"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:37:09 crc kubenswrapper[4636]: I1003 14:37:09.401790 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fa2c95f-4798-46d0-8e21-31334d585714-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4fa2c95f-4798-46d0-8e21-31334d585714" (UID: "4fa2c95f-4798-46d0-8e21-31334d585714"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
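Interleaved with the job teardown above, the kubelet logs a liveness-probe failure for machine-config-daemon: the HTTP GET to 127.0.0.1:8798/health was refused outright, meaning nothing was listening on that port at that moment. A probe of that shape is easy to reproduce by hand; the sketch below (URL taken from the probe output above; the 1-second timeout is an assumption matching the probe default) mirrors the usual HTTP-probe semantics of treating connection errors and 4xx/5xx codes as failures:

  import urllib.request, urllib.error

  def http_probe(url="http://127.0.0.1:8798/health", timeout=1.0):
      # Connection errors or a bad status count as a failed probe;
      # "connect: connection refused" in the log is the former case.
      try:
          with urllib.request.urlopen(url, timeout=timeout) as resp:
              return 200 <= resp.status < 400, f"status={resp.status}"
      except (urllib.error.URLError, OSError) as exc:
          return False, str(exc)

  ok, detail = http_probe()
  print("Liveness:", "success" if ok else "failure", "-", detail)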
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:37:09 crc kubenswrapper[4636]: I1003 14:37:09.468722 4636 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4fa2c95f-4798-46d0-8e21-31334d585714-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 03 14:37:09 crc kubenswrapper[4636]: I1003 14:37:09.468807 4636 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4fa2c95f-4798-46d0-8e21-31334d585714-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 03 14:37:09 crc kubenswrapper[4636]: I1003 14:37:09.468830 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kxb6\" (UniqueName: \"kubernetes.io/projected/4fa2c95f-4798-46d0-8e21-31334d585714-kube-api-access-5kxb6\") on node \"crc\" DevicePath \"\"" Oct 03 14:37:09 crc kubenswrapper[4636]: I1003 14:37:09.892081 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-f24zs" event={"ID":"4fa2c95f-4798-46d0-8e21-31334d585714","Type":"ContainerDied","Data":"7589ced898872730887bf98bc1383c9f481a5c28b0c50410038f87aa7fbc2f70"} Oct 03 14:37:09 crc kubenswrapper[4636]: I1003 14:37:09.892134 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7589ced898872730887bf98bc1383c9f481a5c28b0c50410038f87aa7fbc2f70" Oct 03 14:37:09 crc kubenswrapper[4636]: I1003 14:37:09.892164 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-f24zs" Oct 03 14:37:09 crc kubenswrapper[4636]: I1003 14:37:09.982927 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-tlxw2"] Oct 03 14:37:09 crc kubenswrapper[4636]: E1003 14:37:09.983636 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fa2c95f-4798-46d0-8e21-31334d585714" containerName="ssh-known-hosts-edpm-deployment" Oct 03 14:37:09 crc kubenswrapper[4636]: I1003 14:37:09.983656 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fa2c95f-4798-46d0-8e21-31334d585714" containerName="ssh-known-hosts-edpm-deployment" Oct 03 14:37:09 crc kubenswrapper[4636]: I1003 14:37:09.983902 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fa2c95f-4798-46d0-8e21-31334d585714" containerName="ssh-known-hosts-edpm-deployment" Oct 03 14:37:09 crc kubenswrapper[4636]: I1003 14:37:09.984652 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tlxw2" Oct 03 14:37:09 crc kubenswrapper[4636]: I1003 14:37:09.995638 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-tlxw2"] Oct 03 14:37:10 crc kubenswrapper[4636]: I1003 14:37:10.003562 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 14:37:10 crc kubenswrapper[4636]: I1003 14:37:10.003750 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 14:37:10 crc kubenswrapper[4636]: I1003 14:37:10.004037 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:37:10 crc kubenswrapper[4636]: I1003 14:37:10.004498 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8b7c8" Oct 03 14:37:10 crc kubenswrapper[4636]: I1003 14:37:10.078917 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpm2h\" (UniqueName: \"kubernetes.io/projected/1af273b7-459c-4175-9085-28fa11fb76ee-kube-api-access-gpm2h\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tlxw2\" (UID: \"1af273b7-459c-4175-9085-28fa11fb76ee\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tlxw2" Oct 03 14:37:10 crc kubenswrapper[4636]: I1003 14:37:10.079297 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1af273b7-459c-4175-9085-28fa11fb76ee-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tlxw2\" (UID: \"1af273b7-459c-4175-9085-28fa11fb76ee\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tlxw2" Oct 03 14:37:10 crc kubenswrapper[4636]: I1003 14:37:10.079576 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1af273b7-459c-4175-9085-28fa11fb76ee-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tlxw2\" (UID: \"1af273b7-459c-4175-9085-28fa11fb76ee\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tlxw2" Oct 03 14:37:10 crc kubenswrapper[4636]: I1003 14:37:10.182530 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpm2h\" (UniqueName: \"kubernetes.io/projected/1af273b7-459c-4175-9085-28fa11fb76ee-kube-api-access-gpm2h\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tlxw2\" (UID: \"1af273b7-459c-4175-9085-28fa11fb76ee\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tlxw2" Oct 03 14:37:10 crc kubenswrapper[4636]: I1003 14:37:10.182589 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1af273b7-459c-4175-9085-28fa11fb76ee-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tlxw2\" (UID: \"1af273b7-459c-4175-9085-28fa11fb76ee\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tlxw2" Oct 03 14:37:10 crc kubenswrapper[4636]: I1003 14:37:10.182632 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1af273b7-459c-4175-9085-28fa11fb76ee-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tlxw2\" (UID: \"1af273b7-459c-4175-9085-28fa11fb76ee\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tlxw2" Oct 03 14:37:10 crc kubenswrapper[4636]: I1003 14:37:10.188602 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1af273b7-459c-4175-9085-28fa11fb76ee-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tlxw2\" (UID: \"1af273b7-459c-4175-9085-28fa11fb76ee\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tlxw2" Oct 03 14:37:10 crc kubenswrapper[4636]: I1003 14:37:10.200972 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1af273b7-459c-4175-9085-28fa11fb76ee-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tlxw2\" (UID: \"1af273b7-459c-4175-9085-28fa11fb76ee\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tlxw2" Oct 03 14:37:10 crc kubenswrapper[4636]: I1003 14:37:10.204599 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpm2h\" (UniqueName: \"kubernetes.io/projected/1af273b7-459c-4175-9085-28fa11fb76ee-kube-api-access-gpm2h\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tlxw2\" (UID: \"1af273b7-459c-4175-9085-28fa11fb76ee\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tlxw2" Oct 03 14:37:10 crc kubenswrapper[4636]: I1003 14:37:10.320887 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tlxw2" Oct 03 14:37:10 crc kubenswrapper[4636]: I1003 14:37:10.833674 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-tlxw2"] Oct 03 14:37:10 crc kubenswrapper[4636]: I1003 14:37:10.906055 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tlxw2" event={"ID":"1af273b7-459c-4175-9085-28fa11fb76ee","Type":"ContainerStarted","Data":"45b5d8a17312e0d164f4923c16c70b1c3f9553e4c8dba5a7d83efa5c1876a281"} Oct 03 14:37:12 crc kubenswrapper[4636]: I1003 14:37:12.929913 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tlxw2" event={"ID":"1af273b7-459c-4175-9085-28fa11fb76ee","Type":"ContainerStarted","Data":"c956764ef516cc40867e0fb41b21453b7a79712c0ff408c1b5483c464ed0287d"} Oct 03 14:37:12 crc kubenswrapper[4636]: I1003 14:37:12.949284 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tlxw2" podStartSLOduration=2.791530878 podStartE2EDuration="3.949258461s" podCreationTimestamp="2025-10-03 14:37:09 +0000 UTC" firstStartedPulling="2025-10-03 14:37:10.831985888 +0000 UTC m=+2180.690712135" lastFinishedPulling="2025-10-03 14:37:11.989713471 +0000 UTC m=+2181.848439718" observedRunningTime="2025-10-03 14:37:12.944902744 +0000 UTC m=+2182.803628991" watchObservedRunningTime="2025-10-03 14:37:12.949258461 +0000 UTC m=+2182.807984708" Oct 03 14:37:21 crc kubenswrapper[4636]: I1003 14:37:21.002186 4636 generic.go:334] "Generic (PLEG): container finished" podID="1af273b7-459c-4175-9085-28fa11fb76ee" containerID="c956764ef516cc40867e0fb41b21453b7a79712c0ff408c1b5483c464ed0287d" exitCode=0 Oct 03 14:37:21 crc kubenswrapper[4636]: I1003 14:37:21.002776 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tlxw2" 
event={"ID":"1af273b7-459c-4175-9085-28fa11fb76ee","Type":"ContainerDied","Data":"c956764ef516cc40867e0fb41b21453b7a79712c0ff408c1b5483c464ed0287d"} Oct 03 14:37:22 crc kubenswrapper[4636]: I1003 14:37:22.565508 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tlxw2" Oct 03 14:37:22 crc kubenswrapper[4636]: I1003 14:37:22.620826 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1af273b7-459c-4175-9085-28fa11fb76ee-ssh-key\") pod \"1af273b7-459c-4175-9085-28fa11fb76ee\" (UID: \"1af273b7-459c-4175-9085-28fa11fb76ee\") " Oct 03 14:37:22 crc kubenswrapper[4636]: I1003 14:37:22.620953 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpm2h\" (UniqueName: \"kubernetes.io/projected/1af273b7-459c-4175-9085-28fa11fb76ee-kube-api-access-gpm2h\") pod \"1af273b7-459c-4175-9085-28fa11fb76ee\" (UID: \"1af273b7-459c-4175-9085-28fa11fb76ee\") " Oct 03 14:37:22 crc kubenswrapper[4636]: I1003 14:37:22.620992 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1af273b7-459c-4175-9085-28fa11fb76ee-inventory\") pod \"1af273b7-459c-4175-9085-28fa11fb76ee\" (UID: \"1af273b7-459c-4175-9085-28fa11fb76ee\") " Oct 03 14:37:22 crc kubenswrapper[4636]: I1003 14:37:22.628190 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1af273b7-459c-4175-9085-28fa11fb76ee-kube-api-access-gpm2h" (OuterVolumeSpecName: "kube-api-access-gpm2h") pod "1af273b7-459c-4175-9085-28fa11fb76ee" (UID: "1af273b7-459c-4175-9085-28fa11fb76ee"). InnerVolumeSpecName "kube-api-access-gpm2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:37:22 crc kubenswrapper[4636]: I1003 14:37:22.656978 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1af273b7-459c-4175-9085-28fa11fb76ee-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1af273b7-459c-4175-9085-28fa11fb76ee" (UID: "1af273b7-459c-4175-9085-28fa11fb76ee"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:37:22 crc kubenswrapper[4636]: I1003 14:37:22.657597 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1af273b7-459c-4175-9085-28fa11fb76ee-inventory" (OuterVolumeSpecName: "inventory") pod "1af273b7-459c-4175-9085-28fa11fb76ee" (UID: "1af273b7-459c-4175-9085-28fa11fb76ee"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:37:22 crc kubenswrapper[4636]: I1003 14:37:22.721885 4636 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1af273b7-459c-4175-9085-28fa11fb76ee-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:37:22 crc kubenswrapper[4636]: I1003 14:37:22.721920 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpm2h\" (UniqueName: \"kubernetes.io/projected/1af273b7-459c-4175-9085-28fa11fb76ee-kube-api-access-gpm2h\") on node \"crc\" DevicePath \"\"" Oct 03 14:37:22 crc kubenswrapper[4636]: I1003 14:37:22.721931 4636 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1af273b7-459c-4175-9085-28fa11fb76ee-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 14:37:23 crc kubenswrapper[4636]: I1003 14:37:23.019656 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tlxw2" event={"ID":"1af273b7-459c-4175-9085-28fa11fb76ee","Type":"ContainerDied","Data":"45b5d8a17312e0d164f4923c16c70b1c3f9553e4c8dba5a7d83efa5c1876a281"} Oct 03 14:37:23 crc kubenswrapper[4636]: I1003 14:37:23.019944 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45b5d8a17312e0d164f4923c16c70b1c3f9553e4c8dba5a7d83efa5c1876a281" Oct 03 14:37:23 crc kubenswrapper[4636]: I1003 14:37:23.019771 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tlxw2" Oct 03 14:37:23 crc kubenswrapper[4636]: I1003 14:37:23.115001 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9"] Oct 03 14:37:23 crc kubenswrapper[4636]: E1003 14:37:23.115592 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1af273b7-459c-4175-9085-28fa11fb76ee" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 03 14:37:23 crc kubenswrapper[4636]: I1003 14:37:23.115609 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="1af273b7-459c-4175-9085-28fa11fb76ee" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 03 14:37:23 crc kubenswrapper[4636]: I1003 14:37:23.115792 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="1af273b7-459c-4175-9085-28fa11fb76ee" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 03 14:37:23 crc kubenswrapper[4636]: I1003 14:37:23.116445 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9" Oct 03 14:37:23 crc kubenswrapper[4636]: I1003 14:37:23.118282 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 14:37:23 crc kubenswrapper[4636]: I1003 14:37:23.118511 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8b7c8" Oct 03 14:37:23 crc kubenswrapper[4636]: I1003 14:37:23.118733 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 14:37:23 crc kubenswrapper[4636]: I1003 14:37:23.122014 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:37:23 crc kubenswrapper[4636]: I1003 14:37:23.139049 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9"] Oct 03 14:37:23 crc kubenswrapper[4636]: I1003 14:37:23.232736 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68jww\" (UniqueName: \"kubernetes.io/projected/0057d92e-1564-4b8e-93e9-aee9f862501e-kube-api-access-68jww\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9\" (UID: \"0057d92e-1564-4b8e-93e9-aee9f862501e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9" Oct 03 14:37:23 crc kubenswrapper[4636]: I1003 14:37:23.232882 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0057d92e-1564-4b8e-93e9-aee9f862501e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9\" (UID: \"0057d92e-1564-4b8e-93e9-aee9f862501e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9" Oct 03 14:37:23 crc kubenswrapper[4636]: I1003 14:37:23.232957 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0057d92e-1564-4b8e-93e9-aee9f862501e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9\" (UID: \"0057d92e-1564-4b8e-93e9-aee9f862501e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9" Oct 03 14:37:23 crc kubenswrapper[4636]: I1003 14:37:23.334508 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0057d92e-1564-4b8e-93e9-aee9f862501e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9\" (UID: \"0057d92e-1564-4b8e-93e9-aee9f862501e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9" Oct 03 14:37:23 crc kubenswrapper[4636]: I1003 14:37:23.334606 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68jww\" (UniqueName: \"kubernetes.io/projected/0057d92e-1564-4b8e-93e9-aee9f862501e-kube-api-access-68jww\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9\" (UID: \"0057d92e-1564-4b8e-93e9-aee9f862501e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9" Oct 03 14:37:23 crc kubenswrapper[4636]: I1003 14:37:23.334684 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0057d92e-1564-4b8e-93e9-aee9f862501e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9\" (UID: 
\"0057d92e-1564-4b8e-93e9-aee9f862501e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9" Oct 03 14:37:23 crc kubenswrapper[4636]: I1003 14:37:23.343002 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0057d92e-1564-4b8e-93e9-aee9f862501e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9\" (UID: \"0057d92e-1564-4b8e-93e9-aee9f862501e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9" Oct 03 14:37:23 crc kubenswrapper[4636]: I1003 14:37:23.343013 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0057d92e-1564-4b8e-93e9-aee9f862501e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9\" (UID: \"0057d92e-1564-4b8e-93e9-aee9f862501e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9" Oct 03 14:37:23 crc kubenswrapper[4636]: I1003 14:37:23.363475 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68jww\" (UniqueName: \"kubernetes.io/projected/0057d92e-1564-4b8e-93e9-aee9f862501e-kube-api-access-68jww\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9\" (UID: \"0057d92e-1564-4b8e-93e9-aee9f862501e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9" Oct 03 14:37:23 crc kubenswrapper[4636]: I1003 14:37:23.433134 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9" Oct 03 14:37:24 crc kubenswrapper[4636]: I1003 14:37:24.012208 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9"] Oct 03 14:37:24 crc kubenswrapper[4636]: I1003 14:37:24.056091 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9" event={"ID":"0057d92e-1564-4b8e-93e9-aee9f862501e","Type":"ContainerStarted","Data":"742212203e46ffa4259411a92e19eb1c1c3d64174e0319ae5b5308b88dcf692a"} Oct 03 14:37:25 crc kubenswrapper[4636]: I1003 14:37:25.064372 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9" event={"ID":"0057d92e-1564-4b8e-93e9-aee9f862501e","Type":"ContainerStarted","Data":"34c2fdb5f6dd38ad5ecd8994324b8ef76ec75dbeb6c2f232437be343e05cfc35"} Oct 03 14:37:25 crc kubenswrapper[4636]: I1003 14:37:25.090992 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9" podStartSLOduration=1.6374498659999999 podStartE2EDuration="2.090974423s" podCreationTimestamp="2025-10-03 14:37:23 +0000 UTC" firstStartedPulling="2025-10-03 14:37:24.019417127 +0000 UTC m=+2193.878143374" lastFinishedPulling="2025-10-03 14:37:24.472941684 +0000 UTC m=+2194.331667931" observedRunningTime="2025-10-03 14:37:25.08123168 +0000 UTC m=+2194.939957927" watchObservedRunningTime="2025-10-03 14:37:25.090974423 +0000 UTC m=+2194.949700670" Oct 03 14:37:35 crc kubenswrapper[4636]: I1003 14:37:35.157617 4636 generic.go:334] "Generic (PLEG): container finished" podID="0057d92e-1564-4b8e-93e9-aee9f862501e" containerID="34c2fdb5f6dd38ad5ecd8994324b8ef76ec75dbeb6c2f232437be343e05cfc35" exitCode=0 Oct 03 14:37:35 crc kubenswrapper[4636]: I1003 14:37:35.157814 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9" 
event={"ID":"0057d92e-1564-4b8e-93e9-aee9f862501e","Type":"ContainerDied","Data":"34c2fdb5f6dd38ad5ecd8994324b8ef76ec75dbeb6c2f232437be343e05cfc35"} Oct 03 14:37:36 crc kubenswrapper[4636]: I1003 14:37:36.550980 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9" Oct 03 14:37:36 crc kubenswrapper[4636]: I1003 14:37:36.707642 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0057d92e-1564-4b8e-93e9-aee9f862501e-ssh-key\") pod \"0057d92e-1564-4b8e-93e9-aee9f862501e\" (UID: \"0057d92e-1564-4b8e-93e9-aee9f862501e\") " Oct 03 14:37:36 crc kubenswrapper[4636]: I1003 14:37:36.707740 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0057d92e-1564-4b8e-93e9-aee9f862501e-inventory\") pod \"0057d92e-1564-4b8e-93e9-aee9f862501e\" (UID: \"0057d92e-1564-4b8e-93e9-aee9f862501e\") " Oct 03 14:37:36 crc kubenswrapper[4636]: I1003 14:37:36.707839 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68jww\" (UniqueName: \"kubernetes.io/projected/0057d92e-1564-4b8e-93e9-aee9f862501e-kube-api-access-68jww\") pod \"0057d92e-1564-4b8e-93e9-aee9f862501e\" (UID: \"0057d92e-1564-4b8e-93e9-aee9f862501e\") " Oct 03 14:37:36 crc kubenswrapper[4636]: I1003 14:37:36.716351 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0057d92e-1564-4b8e-93e9-aee9f862501e-kube-api-access-68jww" (OuterVolumeSpecName: "kube-api-access-68jww") pod "0057d92e-1564-4b8e-93e9-aee9f862501e" (UID: "0057d92e-1564-4b8e-93e9-aee9f862501e"). InnerVolumeSpecName "kube-api-access-68jww". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:37:36 crc kubenswrapper[4636]: I1003 14:37:36.732331 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0057d92e-1564-4b8e-93e9-aee9f862501e-inventory" (OuterVolumeSpecName: "inventory") pod "0057d92e-1564-4b8e-93e9-aee9f862501e" (UID: "0057d92e-1564-4b8e-93e9-aee9f862501e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:37:36 crc kubenswrapper[4636]: I1003 14:37:36.733249 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0057d92e-1564-4b8e-93e9-aee9f862501e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0057d92e-1564-4b8e-93e9-aee9f862501e" (UID: "0057d92e-1564-4b8e-93e9-aee9f862501e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:37:36 crc kubenswrapper[4636]: I1003 14:37:36.811881 4636 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0057d92e-1564-4b8e-93e9-aee9f862501e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:37:36 crc kubenswrapper[4636]: I1003 14:37:36.811953 4636 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0057d92e-1564-4b8e-93e9-aee9f862501e-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 14:37:36 crc kubenswrapper[4636]: I1003 14:37:36.811979 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68jww\" (UniqueName: \"kubernetes.io/projected/0057d92e-1564-4b8e-93e9-aee9f862501e-kube-api-access-68jww\") on node \"crc\" DevicePath \"\"" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.189319 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9" event={"ID":"0057d92e-1564-4b8e-93e9-aee9f862501e","Type":"ContainerDied","Data":"742212203e46ffa4259411a92e19eb1c1c3d64174e0319ae5b5308b88dcf692a"} Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.189356 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="742212203e46ffa4259411a92e19eb1c1c3d64174e0319ae5b5308b88dcf692a" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.189424 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.269513 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm"] Oct 03 14:37:37 crc kubenswrapper[4636]: E1003 14:37:37.272542 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0057d92e-1564-4b8e-93e9-aee9f862501e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.272564 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="0057d92e-1564-4b8e-93e9-aee9f862501e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.272769 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="0057d92e-1564-4b8e-93e9-aee9f862501e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.273569 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.278912 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.279419 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.279841 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8b7c8" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.279977 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.280009 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.280278 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.280399 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.280513 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.292817 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm"] Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.422050 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.422114 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.422379 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.422582 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.422661 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.422714 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.422744 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.423060 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.423219 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.423294 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnppj\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-kube-api-access-tnppj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.423335 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: 
\"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.423412 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.423490 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.423541 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.525084 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.525199 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.525246 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.525276 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 
14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.525981 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.526016 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.526134 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.526183 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.526221 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnppj\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-kube-api-access-tnppj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.526244 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.526288 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.526317 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.526340 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.526379 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.534430 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.535258 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.535663 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.535789 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.537156 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.537676 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.539538 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.539842 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.540638 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.540642 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.541660 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.541969 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.543950 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: 
\"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.546384 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnppj\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-kube-api-access-tnppj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rldfm\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:37 crc kubenswrapper[4636]: I1003 14:37:37.614519 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:37:38 crc kubenswrapper[4636]: I1003 14:37:38.126163 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm"] Oct 03 14:37:38 crc kubenswrapper[4636]: I1003 14:37:38.198574 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" event={"ID":"9634671b-cf60-4cdf-9558-417432ff5401","Type":"ContainerStarted","Data":"1eb8f8cea98d7c6067e0b27397df4ac9759b04c7737a6fbeb1780cc4b16ef2c0"} Oct 03 14:37:39 crc kubenswrapper[4636]: I1003 14:37:39.163074 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:37:39 crc kubenswrapper[4636]: I1003 14:37:39.164792 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:37:39 crc kubenswrapper[4636]: I1003 14:37:39.208842 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" event={"ID":"9634671b-cf60-4cdf-9558-417432ff5401","Type":"ContainerStarted","Data":"b30503c3991107050cb86506af7aa9eb884d5fd4c30c1a6d967470643f40fded"} Oct 03 14:37:39 crc kubenswrapper[4636]: I1003 14:37:39.240374 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" podStartSLOduration=1.606548994 podStartE2EDuration="2.240351279s" podCreationTimestamp="2025-10-03 14:37:37 +0000 UTC" firstStartedPulling="2025-10-03 14:37:38.132911693 +0000 UTC m=+2207.991637940" lastFinishedPulling="2025-10-03 14:37:38.766713978 +0000 UTC m=+2208.625440225" observedRunningTime="2025-10-03 14:37:39.232600639 +0000 UTC m=+2209.091326886" watchObservedRunningTime="2025-10-03 14:37:39.240351279 +0000 UTC m=+2209.099077536" Oct 03 14:38:04 crc kubenswrapper[4636]: I1003 14:38:04.379709 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kzpkq"] Oct 03 14:38:04 crc kubenswrapper[4636]: I1003 14:38:04.382401 4636 util.go:30] "No sandbox for pod can be found. 
Oct 03 14:38:04 crc kubenswrapper[4636]: I1003 14:38:04.379709 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kzpkq"]
Oct 03 14:38:04 crc kubenswrapper[4636]: I1003 14:38:04.382401 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kzpkq"
Oct 03 14:38:04 crc kubenswrapper[4636]: I1003 14:38:04.404163 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kzpkq"]
Oct 03 14:38:04 crc kubenswrapper[4636]: I1003 14:38:04.480435 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a86c3015-1a12-4a2a-a570-46a858afe5a0-utilities\") pod \"community-operators-kzpkq\" (UID: \"a86c3015-1a12-4a2a-a570-46a858afe5a0\") " pod="openshift-marketplace/community-operators-kzpkq"
Oct 03 14:38:04 crc kubenswrapper[4636]: I1003 14:38:04.480486 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2vkn\" (UniqueName: \"kubernetes.io/projected/a86c3015-1a12-4a2a-a570-46a858afe5a0-kube-api-access-p2vkn\") pod \"community-operators-kzpkq\" (UID: \"a86c3015-1a12-4a2a-a570-46a858afe5a0\") " pod="openshift-marketplace/community-operators-kzpkq"
Oct 03 14:38:04 crc kubenswrapper[4636]: I1003 14:38:04.480544 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a86c3015-1a12-4a2a-a570-46a858afe5a0-catalog-content\") pod \"community-operators-kzpkq\" (UID: \"a86c3015-1a12-4a2a-a570-46a858afe5a0\") " pod="openshift-marketplace/community-operators-kzpkq"
Oct 03 14:38:04 crc kubenswrapper[4636]: I1003 14:38:04.582782 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a86c3015-1a12-4a2a-a570-46a858afe5a0-utilities\") pod \"community-operators-kzpkq\" (UID: \"a86c3015-1a12-4a2a-a570-46a858afe5a0\") " pod="openshift-marketplace/community-operators-kzpkq"
Oct 03 14:38:04 crc kubenswrapper[4636]: I1003 14:38:04.582837 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2vkn\" (UniqueName: \"kubernetes.io/projected/a86c3015-1a12-4a2a-a570-46a858afe5a0-kube-api-access-p2vkn\") pod \"community-operators-kzpkq\" (UID: \"a86c3015-1a12-4a2a-a570-46a858afe5a0\") " pod="openshift-marketplace/community-operators-kzpkq"
Oct 03 14:38:04 crc kubenswrapper[4636]: I1003 14:38:04.582904 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a86c3015-1a12-4a2a-a570-46a858afe5a0-catalog-content\") pod \"community-operators-kzpkq\" (UID: \"a86c3015-1a12-4a2a-a570-46a858afe5a0\") " pod="openshift-marketplace/community-operators-kzpkq"
Oct 03 14:38:04 crc kubenswrapper[4636]: I1003 14:38:04.583488 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a86c3015-1a12-4a2a-a570-46a858afe5a0-catalog-content\") pod \"community-operators-kzpkq\" (UID: \"a86c3015-1a12-4a2a-a570-46a858afe5a0\") " pod="openshift-marketplace/community-operators-kzpkq"
Oct 03 14:38:04 crc kubenswrapper[4636]: I1003 14:38:04.583574 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a86c3015-1a12-4a2a-a570-46a858afe5a0-utilities\") pod \"community-operators-kzpkq\" (UID: \"a86c3015-1a12-4a2a-a570-46a858afe5a0\") " pod="openshift-marketplace/community-operators-kzpkq"
Oct 03 14:38:04 crc kubenswrapper[4636]: I1003 14:38:04.605074 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2vkn\" (UniqueName: \"kubernetes.io/projected/a86c3015-1a12-4a2a-a570-46a858afe5a0-kube-api-access-p2vkn\") pod \"community-operators-kzpkq\" (UID: \"a86c3015-1a12-4a2a-a570-46a858afe5a0\") " pod="openshift-marketplace/community-operators-kzpkq"
Oct 03 14:38:04 crc kubenswrapper[4636]: I1003 14:38:04.702618 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kzpkq"
Oct 03 14:38:05 crc kubenswrapper[4636]: I1003 14:38:05.307200 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kzpkq"]
Oct 03 14:38:05 crc kubenswrapper[4636]: I1003 14:38:05.412440 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzpkq" event={"ID":"a86c3015-1a12-4a2a-a570-46a858afe5a0","Type":"ContainerStarted","Data":"0f2444acaf019428f8d3f3ee556ef86b2322d38b140cf3aab84f551bc9bf6745"}
Oct 03 14:38:06 crc kubenswrapper[4636]: I1003 14:38:06.422255 4636 generic.go:334] "Generic (PLEG): container finished" podID="a86c3015-1a12-4a2a-a570-46a858afe5a0" containerID="b295a733ffb8f11cc7f54b0984bd9b9803f678148442421fa8175ab9920dc928" exitCode=0
Oct 03 14:38:06 crc kubenswrapper[4636]: I1003 14:38:06.422376 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzpkq" event={"ID":"a86c3015-1a12-4a2a-a570-46a858afe5a0","Type":"ContainerDied","Data":"b295a733ffb8f11cc7f54b0984bd9b9803f678148442421fa8175ab9920dc928"}
Oct 03 14:38:07 crc kubenswrapper[4636]: I1003 14:38:07.433692 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzpkq" event={"ID":"a86c3015-1a12-4a2a-a570-46a858afe5a0","Type":"ContainerStarted","Data":"48abbc299adf4edbe31cff0d52fc10351d3eeb013b324858334dd50bd8bf7618"}
Oct 03 14:38:09 crc kubenswrapper[4636]: I1003 14:38:09.162971 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:38:09 crc kubenswrapper[4636]: I1003 14:38:09.163027 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:38:09 crc kubenswrapper[4636]: I1003 14:38:09.163069 4636 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ngmch"
Oct 03 14:38:09 crc kubenswrapper[4636]: I1003 14:38:09.163864 4636 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab"} pod="openshift-machine-config-operator/machine-config-daemon-ngmch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 03 14:38:09 crc kubenswrapper[4636]: I1003 14:38:09.163921 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" containerID="cri-o://021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" gracePeriod=600
Oct 03 14:38:09 crc kubenswrapper[4636]: E1003 14:38:09.287429 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:38:09 crc kubenswrapper[4636]: I1003 14:38:09.452230 4636 generic.go:334] "Generic (PLEG): container finished" podID="a86c3015-1a12-4a2a-a570-46a858afe5a0" containerID="48abbc299adf4edbe31cff0d52fc10351d3eeb013b324858334dd50bd8bf7618" exitCode=0
Oct 03 14:38:09 crc kubenswrapper[4636]: I1003 14:38:09.452304 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzpkq" event={"ID":"a86c3015-1a12-4a2a-a570-46a858afe5a0","Type":"ContainerDied","Data":"48abbc299adf4edbe31cff0d52fc10351d3eeb013b324858334dd50bd8bf7618"}
Oct 03 14:38:09 crc kubenswrapper[4636]: I1003 14:38:09.456185 4636 generic.go:334] "Generic (PLEG): container finished" podID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" exitCode=0
Oct 03 14:38:09 crc kubenswrapper[4636]: I1003 14:38:09.456215 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerDied","Data":"021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab"}
Oct 03 14:38:09 crc kubenswrapper[4636]: I1003 14:38:09.456248 4636 scope.go:117] "RemoveContainer" containerID="6bf347f17c6a57808711c5d59e3068eaceb6db558930f7501a3d5cd85b4d3b8e"
Oct 03 14:38:09 crc kubenswrapper[4636]: I1003 14:38:09.456884 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab"
Oct 03 14:38:09 crc kubenswrapper[4636]: E1003 14:38:09.457139 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:38:10 crc kubenswrapper[4636]: I1003 14:38:10.473067 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzpkq" event={"ID":"a86c3015-1a12-4a2a-a570-46a858afe5a0","Type":"ContainerStarted","Data":"9e5ae0973ee986ba80efdaf1c7617ff5acc3e45e7178d097f5c61b9cb206cb9e"}
Oct 03 14:38:14 crc kubenswrapper[4636]: I1003 14:38:14.703587 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kzpkq"
Oct 03 14:38:14 crc kubenswrapper[4636]: I1003 14:38:14.704005 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kzpkq"
Oct 03 14:38:14 crc kubenswrapper[4636]: I1003 14:38:14.746214 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kzpkq"
Oct 03 14:38:14 crc kubenswrapper[4636]: I1003 14:38:14.765779 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kzpkq" podStartSLOduration=7.2197882700000005 podStartE2EDuration="10.765759378s" podCreationTimestamp="2025-10-03 14:38:04 +0000 UTC" firstStartedPulling="2025-10-03 14:38:06.424655289 +0000 UTC m=+2236.283381536" lastFinishedPulling="2025-10-03 14:38:09.970626397 +0000 UTC m=+2239.829352644" observedRunningTime="2025-10-03 14:38:10.501937153 +0000 UTC m=+2240.360663410" watchObservedRunningTime="2025-10-03 14:38:14.765759378 +0000 UTC m=+2244.624485615"
Oct 03 14:38:15 crc kubenswrapper[4636]: I1003 14:38:15.575612 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kzpkq"
Oct 03 14:38:15 crc kubenswrapper[4636]: I1003 14:38:15.636625 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kzpkq"]
Oct 03 14:38:17 crc kubenswrapper[4636]: I1003 14:38:17.545612 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kzpkq" podUID="a86c3015-1a12-4a2a-a570-46a858afe5a0" containerName="registry-server" containerID="cri-o://9e5ae0973ee986ba80efdaf1c7617ff5acc3e45e7178d097f5c61b9cb206cb9e" gracePeriod=2
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.002929 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kzpkq"
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.169386 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a86c3015-1a12-4a2a-a570-46a858afe5a0-utilities\") pod \"a86c3015-1a12-4a2a-a570-46a858afe5a0\" (UID: \"a86c3015-1a12-4a2a-a570-46a858afe5a0\") "
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.169548 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2vkn\" (UniqueName: \"kubernetes.io/projected/a86c3015-1a12-4a2a-a570-46a858afe5a0-kube-api-access-p2vkn\") pod \"a86c3015-1a12-4a2a-a570-46a858afe5a0\" (UID: \"a86c3015-1a12-4a2a-a570-46a858afe5a0\") "
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.169629 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a86c3015-1a12-4a2a-a570-46a858afe5a0-catalog-content\") pod \"a86c3015-1a12-4a2a-a570-46a858afe5a0\" (UID: \"a86c3015-1a12-4a2a-a570-46a858afe5a0\") "
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.170233 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a86c3015-1a12-4a2a-a570-46a858afe5a0-utilities" (OuterVolumeSpecName: "utilities") pod "a86c3015-1a12-4a2a-a570-46a858afe5a0" (UID: "a86c3015-1a12-4a2a-a570-46a858afe5a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.175621 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a86c3015-1a12-4a2a-a570-46a858afe5a0-kube-api-access-p2vkn" (OuterVolumeSpecName: "kube-api-access-p2vkn") pod "a86c3015-1a12-4a2a-a570-46a858afe5a0" (UID: "a86c3015-1a12-4a2a-a570-46a858afe5a0"). InnerVolumeSpecName "kube-api-access-p2vkn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.221436 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a86c3015-1a12-4a2a-a570-46a858afe5a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a86c3015-1a12-4a2a-a570-46a858afe5a0" (UID: "a86c3015-1a12-4a2a-a570-46a858afe5a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.272554 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a86c3015-1a12-4a2a-a570-46a858afe5a0-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.272601 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2vkn\" (UniqueName: \"kubernetes.io/projected/a86c3015-1a12-4a2a-a570-46a858afe5a0-kube-api-access-p2vkn\") on node \"crc\" DevicePath \"\""
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.272614 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a86c3015-1a12-4a2a-a570-46a858afe5a0-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.563543 4636 generic.go:334] "Generic (PLEG): container finished" podID="a86c3015-1a12-4a2a-a570-46a858afe5a0" containerID="9e5ae0973ee986ba80efdaf1c7617ff5acc3e45e7178d097f5c61b9cb206cb9e" exitCode=0
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.563606 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzpkq" event={"ID":"a86c3015-1a12-4a2a-a570-46a858afe5a0","Type":"ContainerDied","Data":"9e5ae0973ee986ba80efdaf1c7617ff5acc3e45e7178d097f5c61b9cb206cb9e"}
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.563650 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzpkq" event={"ID":"a86c3015-1a12-4a2a-a570-46a858afe5a0","Type":"ContainerDied","Data":"0f2444acaf019428f8d3f3ee556ef86b2322d38b140cf3aab84f551bc9bf6745"}
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.563679 4636 scope.go:117] "RemoveContainer" containerID="9e5ae0973ee986ba80efdaf1c7617ff5acc3e45e7178d097f5c61b9cb206cb9e"
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.563909 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kzpkq"
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.609946 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kzpkq"]
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.619039 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kzpkq"]
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.620941 4636 scope.go:117] "RemoveContainer" containerID="48abbc299adf4edbe31cff0d52fc10351d3eeb013b324858334dd50bd8bf7618"
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.642411 4636 scope.go:117] "RemoveContainer" containerID="b295a733ffb8f11cc7f54b0984bd9b9803f678148442421fa8175ab9920dc928"
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.684147 4636 scope.go:117] "RemoveContainer" containerID="9e5ae0973ee986ba80efdaf1c7617ff5acc3e45e7178d097f5c61b9cb206cb9e"
Oct 03 14:38:18 crc kubenswrapper[4636]: E1003 14:38:18.684795 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e5ae0973ee986ba80efdaf1c7617ff5acc3e45e7178d097f5c61b9cb206cb9e\": container with ID starting with 9e5ae0973ee986ba80efdaf1c7617ff5acc3e45e7178d097f5c61b9cb206cb9e not found: ID does not exist" containerID="9e5ae0973ee986ba80efdaf1c7617ff5acc3e45e7178d097f5c61b9cb206cb9e"
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.684851 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e5ae0973ee986ba80efdaf1c7617ff5acc3e45e7178d097f5c61b9cb206cb9e"} err="failed to get container status \"9e5ae0973ee986ba80efdaf1c7617ff5acc3e45e7178d097f5c61b9cb206cb9e\": rpc error: code = NotFound desc = could not find container \"9e5ae0973ee986ba80efdaf1c7617ff5acc3e45e7178d097f5c61b9cb206cb9e\": container with ID starting with 9e5ae0973ee986ba80efdaf1c7617ff5acc3e45e7178d097f5c61b9cb206cb9e not found: ID does not exist"
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.684883 4636 scope.go:117] "RemoveContainer" containerID="48abbc299adf4edbe31cff0d52fc10351d3eeb013b324858334dd50bd8bf7618"
Oct 03 14:38:18 crc kubenswrapper[4636]: E1003 14:38:18.685298 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48abbc299adf4edbe31cff0d52fc10351d3eeb013b324858334dd50bd8bf7618\": container with ID starting with 48abbc299adf4edbe31cff0d52fc10351d3eeb013b324858334dd50bd8bf7618 not found: ID does not exist" containerID="48abbc299adf4edbe31cff0d52fc10351d3eeb013b324858334dd50bd8bf7618"
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.685321 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48abbc299adf4edbe31cff0d52fc10351d3eeb013b324858334dd50bd8bf7618"} err="failed to get container status \"48abbc299adf4edbe31cff0d52fc10351d3eeb013b324858334dd50bd8bf7618\": rpc error: code = NotFound desc = could not find container \"48abbc299adf4edbe31cff0d52fc10351d3eeb013b324858334dd50bd8bf7618\": container with ID starting with 48abbc299adf4edbe31cff0d52fc10351d3eeb013b324858334dd50bd8bf7618 not found: ID does not exist"
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.685337 4636 scope.go:117] "RemoveContainer" containerID="b295a733ffb8f11cc7f54b0984bd9b9803f678148442421fa8175ab9920dc928"
Oct 03 14:38:18 crc kubenswrapper[4636]: E1003 14:38:18.685659 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b295a733ffb8f11cc7f54b0984bd9b9803f678148442421fa8175ab9920dc928\": container with ID starting with b295a733ffb8f11cc7f54b0984bd9b9803f678148442421fa8175ab9920dc928 not found: ID does not exist" containerID="b295a733ffb8f11cc7f54b0984bd9b9803f678148442421fa8175ab9920dc928"
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.685709 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b295a733ffb8f11cc7f54b0984bd9b9803f678148442421fa8175ab9920dc928"} err="failed to get container status \"b295a733ffb8f11cc7f54b0984bd9b9803f678148442421fa8175ab9920dc928\": rpc error: code = NotFound desc = could not find container \"b295a733ffb8f11cc7f54b0984bd9b9803f678148442421fa8175ab9920dc928\": container with ID starting with b295a733ffb8f11cc7f54b0984bd9b9803f678148442421fa8175ab9920dc928 not found: ID does not exist"
Oct 03 14:38:18 crc kubenswrapper[4636]: I1003 14:38:18.806435 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a86c3015-1a12-4a2a-a570-46a858afe5a0" path="/var/lib/kubelet/pods/a86c3015-1a12-4a2a-a570-46a858afe5a0/volumes"
Oct 03 14:38:19 crc kubenswrapper[4636]: I1003 14:38:19.607077 4636 generic.go:334] "Generic (PLEG): container finished" podID="9634671b-cf60-4cdf-9558-417432ff5401" containerID="b30503c3991107050cb86506af7aa9eb884d5fd4c30c1a6d967470643f40fded" exitCode=0
Oct 03 14:38:19 crc kubenswrapper[4636]: I1003 14:38:19.607139 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" event={"ID":"9634671b-cf60-4cdf-9558-417432ff5401","Type":"ContainerDied","Data":"b30503c3991107050cb86506af7aa9eb884d5fd4c30c1a6d967470643f40fded"}
Oct 03 14:38:20 crc kubenswrapper[4636]: I1003 14:38:20.806989 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab"
Oct 03 14:38:20 crc kubenswrapper[4636]: E1003 14:38:20.808492 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.017027 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm"
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.133914 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-neutron-metadata-combined-ca-bundle\") pod \"9634671b-cf60-4cdf-9558-417432ff5401\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") "
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.134046 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"9634671b-cf60-4cdf-9558-417432ff5401\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") "
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.134066 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnppj\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-kube-api-access-tnppj\") pod \"9634671b-cf60-4cdf-9558-417432ff5401\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") "
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.134166 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-ovn-default-certs-0\") pod \"9634671b-cf60-4cdf-9558-417432ff5401\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") "
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.134184 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-telemetry-combined-ca-bundle\") pod \"9634671b-cf60-4cdf-9558-417432ff5401\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") "
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.134200 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-bootstrap-combined-ca-bundle\") pod \"9634671b-cf60-4cdf-9558-417432ff5401\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") "
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.134228 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"9634671b-cf60-4cdf-9558-417432ff5401\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") "
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.134267 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-inventory\") pod \"9634671b-cf60-4cdf-9558-417432ff5401\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") "
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.134305 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-ssh-key\") pod \"9634671b-cf60-4cdf-9558-417432ff5401\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") "
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.134336 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-repo-setup-combined-ca-bundle\") pod \"9634671b-cf60-4cdf-9558-417432ff5401\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") "
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.134371 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-nova-combined-ca-bundle\") pod \"9634671b-cf60-4cdf-9558-417432ff5401\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") "
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.134412 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-ovn-combined-ca-bundle\") pod \"9634671b-cf60-4cdf-9558-417432ff5401\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") "
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.134441 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-libvirt-combined-ca-bundle\") pod \"9634671b-cf60-4cdf-9558-417432ff5401\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") "
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.134473 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"9634671b-cf60-4cdf-9558-417432ff5401\" (UID: \"9634671b-cf60-4cdf-9558-417432ff5401\") "
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.141156 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9634671b-cf60-4cdf-9558-417432ff5401" (UID: "9634671b-cf60-4cdf-9558-417432ff5401"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.141673 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "9634671b-cf60-4cdf-9558-417432ff5401" (UID: "9634671b-cf60-4cdf-9558-417432ff5401"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.143347 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "9634671b-cf60-4cdf-9558-417432ff5401" (UID: "9634671b-cf60-4cdf-9558-417432ff5401"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.143475 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-kube-api-access-tnppj" (OuterVolumeSpecName: "kube-api-access-tnppj") pod "9634671b-cf60-4cdf-9558-417432ff5401" (UID: "9634671b-cf60-4cdf-9558-417432ff5401"). InnerVolumeSpecName "kube-api-access-tnppj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.143741 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9634671b-cf60-4cdf-9558-417432ff5401" (UID: "9634671b-cf60-4cdf-9558-417432ff5401"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.145894 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "9634671b-cf60-4cdf-9558-417432ff5401" (UID: "9634671b-cf60-4cdf-9558-417432ff5401"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.146249 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "9634671b-cf60-4cdf-9558-417432ff5401" (UID: "9634671b-cf60-4cdf-9558-417432ff5401"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.146587 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9634671b-cf60-4cdf-9558-417432ff5401" (UID: "9634671b-cf60-4cdf-9558-417432ff5401"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.146701 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9634671b-cf60-4cdf-9558-417432ff5401" (UID: "9634671b-cf60-4cdf-9558-417432ff5401"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.147047 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9634671b-cf60-4cdf-9558-417432ff5401" (UID: "9634671b-cf60-4cdf-9558-417432ff5401"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.149705 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "9634671b-cf60-4cdf-9558-417432ff5401" (UID: "9634671b-cf60-4cdf-9558-417432ff5401"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.154637 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "9634671b-cf60-4cdf-9558-417432ff5401" (UID: "9634671b-cf60-4cdf-9558-417432ff5401"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.165897 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-inventory" (OuterVolumeSpecName: "inventory") pod "9634671b-cf60-4cdf-9558-417432ff5401" (UID: "9634671b-cf60-4cdf-9558-417432ff5401"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.166617 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9634671b-cf60-4cdf-9558-417432ff5401" (UID: "9634671b-cf60-4cdf-9558-417432ff5401"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.237061 4636 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.237111 4636 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.237127 4636 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.237146 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnppj\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-kube-api-access-tnppj\") on node \"crc\" DevicePath \"\""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.237158 4636 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.237171 4636 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.237182 4636 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.237192 4636 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9634671b-cf60-4cdf-9558-417432ff5401-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.237203 4636 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-inventory\") on node \"crc\" DevicePath \"\""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.237213 4636 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.237223 4636 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
\"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.237246 4636 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.237256 4636 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9634671b-cf60-4cdf-9558-417432ff5401-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.626623 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" event={"ID":"9634671b-cf60-4cdf-9558-417432ff5401","Type":"ContainerDied","Data":"1eb8f8cea98d7c6067e0b27397df4ac9759b04c7737a6fbeb1780cc4b16ef2c0"} Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.626665 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1eb8f8cea98d7c6067e0b27397df4ac9759b04c7737a6fbeb1780cc4b16ef2c0" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.626668 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rldfm" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.731854 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6"] Oct 03 14:38:21 crc kubenswrapper[4636]: E1003 14:38:21.732281 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a86c3015-1a12-4a2a-a570-46a858afe5a0" containerName="extract-utilities" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.732301 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86c3015-1a12-4a2a-a570-46a858afe5a0" containerName="extract-utilities" Oct 03 14:38:21 crc kubenswrapper[4636]: E1003 14:38:21.732352 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a86c3015-1a12-4a2a-a570-46a858afe5a0" containerName="registry-server" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.732359 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86c3015-1a12-4a2a-a570-46a858afe5a0" containerName="registry-server" Oct 03 14:38:21 crc kubenswrapper[4636]: E1003 14:38:21.732386 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9634671b-cf60-4cdf-9558-417432ff5401" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.732393 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="9634671b-cf60-4cdf-9558-417432ff5401" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 03 14:38:21 crc kubenswrapper[4636]: E1003 14:38:21.732412 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a86c3015-1a12-4a2a-a570-46a858afe5a0" containerName="extract-content" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.732422 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86c3015-1a12-4a2a-a570-46a858afe5a0" containerName="extract-content" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.732598 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="a86c3015-1a12-4a2a-a570-46a858afe5a0" containerName="registry-server" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 
14:38:21.732608 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="9634671b-cf60-4cdf-9558-417432ff5401" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.733222 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.735937 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8b7c8" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.736108 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.736248 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.736392 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.736503 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.746740 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6"] Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.846243 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1e8fa7f-c140-4196-8967-ca303b35e8c5-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frzc6\" (UID: \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.846288 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpp7x\" (UniqueName: \"kubernetes.io/projected/d1e8fa7f-c140-4196-8967-ca303b35e8c5-kube-api-access-kpp7x\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frzc6\" (UID: \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.846392 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e8fa7f-c140-4196-8967-ca303b35e8c5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frzc6\" (UID: \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.846737 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d1e8fa7f-c140-4196-8967-ca303b35e8c5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frzc6\" (UID: \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.846774 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1e8fa7f-c140-4196-8967-ca303b35e8c5-inventory\") pod 
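
[Note] The cpu_manager.go:410 / memory_manager.go:354 RemoveStaleState and state_mem.go:107 "Deleted CPUSet assignment" lines fire while the new ovn-edpm pod is being admitted: the resource managers drop per-container accounting left behind by the containers that just exited (the kzpkq catalog containers and the install-certs job container). Despite the E (error) severity on the cpu_manager lines, this appears to be routine cleanup of state for containers that no longer exist. The reflector.go "Caches populated" lines that follow show the kubelet starting watches for the Secrets and ConfigMaps the new pod references.
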
\"ovn-edpm-deployment-openstack-edpm-ipam-frzc6\" (UID: \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.948622 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1e8fa7f-c140-4196-8967-ca303b35e8c5-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frzc6\" (UID: \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.948661 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpp7x\" (UniqueName: \"kubernetes.io/projected/d1e8fa7f-c140-4196-8967-ca303b35e8c5-kube-api-access-kpp7x\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frzc6\" (UID: \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.948733 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e8fa7f-c140-4196-8967-ca303b35e8c5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frzc6\" (UID: \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.948826 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d1e8fa7f-c140-4196-8967-ca303b35e8c5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frzc6\" (UID: \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.948854 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1e8fa7f-c140-4196-8967-ca303b35e8c5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frzc6\" (UID: \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.949768 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d1e8fa7f-c140-4196-8967-ca303b35e8c5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frzc6\" (UID: \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.952277 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1e8fa7f-c140-4196-8967-ca303b35e8c5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frzc6\" (UID: \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.957677 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1e8fa7f-c140-4196-8967-ca303b35e8c5-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frzc6\" (UID: \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" Oct 03 14:38:21 crc 
kubenswrapper[4636]: I1003 14:38:21.957859 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e8fa7f-c140-4196-8967-ca303b35e8c5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frzc6\" (UID: \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" Oct 03 14:38:21 crc kubenswrapper[4636]: I1003 14:38:21.964301 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpp7x\" (UniqueName: \"kubernetes.io/projected/d1e8fa7f-c140-4196-8967-ca303b35e8c5-kube-api-access-kpp7x\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-frzc6\" (UID: \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" Oct 03 14:38:22 crc kubenswrapper[4636]: I1003 14:38:22.104590 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" Oct 03 14:38:22 crc kubenswrapper[4636]: I1003 14:38:22.648193 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6"] Oct 03 14:38:22 crc kubenswrapper[4636]: W1003 14:38:22.655859 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1e8fa7f_c140_4196_8967_ca303b35e8c5.slice/crio-3e065bcb4a5204bc2f46a5897acb8564981a0399404c96940f7397b06f252105 WatchSource:0}: Error finding container 3e065bcb4a5204bc2f46a5897acb8564981a0399404c96940f7397b06f252105: Status 404 returned error can't find the container with id 3e065bcb4a5204bc2f46a5897acb8564981a0399404c96940f7397b06f252105 Oct 03 14:38:23 crc kubenswrapper[4636]: I1003 14:38:23.646013 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" event={"ID":"d1e8fa7f-c140-4196-8967-ca303b35e8c5","Type":"ContainerStarted","Data":"eade10c5a8e97e9ed7585e5c423adace92c80f3190f93ed0f4d29e6546a14547"} Oct 03 14:38:23 crc kubenswrapper[4636]: I1003 14:38:23.646060 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" event={"ID":"d1e8fa7f-c140-4196-8967-ca303b35e8c5","Type":"ContainerStarted","Data":"3e065bcb4a5204bc2f46a5897acb8564981a0399404c96940f7397b06f252105"} Oct 03 14:38:23 crc kubenswrapper[4636]: I1003 14:38:23.667725 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" podStartSLOduration=2.210251005 podStartE2EDuration="2.667705155s" podCreationTimestamp="2025-10-03 14:38:21 +0000 UTC" firstStartedPulling="2025-10-03 14:38:22.658062881 +0000 UTC m=+2252.516789118" lastFinishedPulling="2025-10-03 14:38:23.115517021 +0000 UTC m=+2252.974243268" observedRunningTime="2025-10-03 14:38:23.662123701 +0000 UTC m=+2253.520849948" watchObservedRunningTime="2025-10-03 14:38:23.667705155 +0000 UTC m=+2253.526431402" Oct 03 14:38:34 crc kubenswrapper[4636]: I1003 14:38:34.794914 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:38:34 crc kubenswrapper[4636]: E1003 14:38:34.795631 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:38:45 crc kubenswrapper[4636]: I1003 14:38:45.793662 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:38:45 crc kubenswrapper[4636]: E1003 14:38:45.794429 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:38:58 crc kubenswrapper[4636]: I1003 14:38:58.793796 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:38:58 crc kubenswrapper[4636]: E1003 14:38:58.794674 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:39:11 crc kubenswrapper[4636]: I1003 14:39:11.793608 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:39:11 crc kubenswrapper[4636]: E1003 14:39:11.794408 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:39:26 crc kubenswrapper[4636]: I1003 14:39:26.794049 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:39:26 crc kubenswrapper[4636]: E1003 14:39:26.794918 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:39:32 crc kubenswrapper[4636]: I1003 14:39:32.186388 4636 generic.go:334] "Generic (PLEG): container finished" podID="d1e8fa7f-c140-4196-8967-ca303b35e8c5" containerID="eade10c5a8e97e9ed7585e5c423adace92c80f3190f93ed0f4d29e6546a14547" exitCode=0 Oct 03 14:39:32 crc kubenswrapper[4636]: I1003 14:39:32.186455 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" event={"ID":"d1e8fa7f-c140-4196-8967-ca303b35e8c5","Type":"ContainerDied","Data":"eade10c5a8e97e9ed7585e5c423adace92c80f3190f93ed0f4d29e6546a14547"} Oct 03 14:39:33 crc kubenswrapper[4636]: I1003 
14:39:33.601221 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" Oct 03 14:39:33 crc kubenswrapper[4636]: I1003 14:39:33.673701 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d1e8fa7f-c140-4196-8967-ca303b35e8c5-ovncontroller-config-0\") pod \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\" (UID: \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\") " Oct 03 14:39:33 crc kubenswrapper[4636]: I1003 14:39:33.673757 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1e8fa7f-c140-4196-8967-ca303b35e8c5-inventory\") pod \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\" (UID: \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\") " Oct 03 14:39:33 crc kubenswrapper[4636]: I1003 14:39:33.673796 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1e8fa7f-c140-4196-8967-ca303b35e8c5-ssh-key\") pod \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\" (UID: \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\") " Oct 03 14:39:33 crc kubenswrapper[4636]: I1003 14:39:33.673887 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e8fa7f-c140-4196-8967-ca303b35e8c5-ovn-combined-ca-bundle\") pod \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\" (UID: \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\") " Oct 03 14:39:33 crc kubenswrapper[4636]: I1003 14:39:33.673956 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpp7x\" (UniqueName: \"kubernetes.io/projected/d1e8fa7f-c140-4196-8967-ca303b35e8c5-kube-api-access-kpp7x\") pod \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\" (UID: \"d1e8fa7f-c140-4196-8967-ca303b35e8c5\") " Oct 03 14:39:33 crc kubenswrapper[4636]: I1003 14:39:33.695202 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1e8fa7f-c140-4196-8967-ca303b35e8c5-kube-api-access-kpp7x" (OuterVolumeSpecName: "kube-api-access-kpp7x") pod "d1e8fa7f-c140-4196-8967-ca303b35e8c5" (UID: "d1e8fa7f-c140-4196-8967-ca303b35e8c5"). InnerVolumeSpecName "kube-api-access-kpp7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:39:33 crc kubenswrapper[4636]: I1003 14:39:33.696370 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e8fa7f-c140-4196-8967-ca303b35e8c5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d1e8fa7f-c140-4196-8967-ca303b35e8c5" (UID: "d1e8fa7f-c140-4196-8967-ca303b35e8c5"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:39:33 crc kubenswrapper[4636]: I1003 14:39:33.704895 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1e8fa7f-c140-4196-8967-ca303b35e8c5-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "d1e8fa7f-c140-4196-8967-ca303b35e8c5" (UID: "d1e8fa7f-c140-4196-8967-ca303b35e8c5"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:39:33 crc kubenswrapper[4636]: I1003 14:39:33.710732 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e8fa7f-c140-4196-8967-ca303b35e8c5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d1e8fa7f-c140-4196-8967-ca303b35e8c5" (UID: "d1e8fa7f-c140-4196-8967-ca303b35e8c5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:39:33 crc kubenswrapper[4636]: I1003 14:39:33.725768 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e8fa7f-c140-4196-8967-ca303b35e8c5-inventory" (OuterVolumeSpecName: "inventory") pod "d1e8fa7f-c140-4196-8967-ca303b35e8c5" (UID: "d1e8fa7f-c140-4196-8967-ca303b35e8c5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:39:33 crc kubenswrapper[4636]: I1003 14:39:33.775717 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpp7x\" (UniqueName: \"kubernetes.io/projected/d1e8fa7f-c140-4196-8967-ca303b35e8c5-kube-api-access-kpp7x\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:33 crc kubenswrapper[4636]: I1003 14:39:33.775766 4636 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d1e8fa7f-c140-4196-8967-ca303b35e8c5-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:33 crc kubenswrapper[4636]: I1003 14:39:33.775781 4636 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1e8fa7f-c140-4196-8967-ca303b35e8c5-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:33 crc kubenswrapper[4636]: I1003 14:39:33.775794 4636 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1e8fa7f-c140-4196-8967-ca303b35e8c5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:33 crc kubenswrapper[4636]: I1003 14:39:33.775804 4636 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e8fa7f-c140-4196-8967-ca303b35e8c5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.203772 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" event={"ID":"d1e8fa7f-c140-4196-8967-ca303b35e8c5","Type":"ContainerDied","Data":"3e065bcb4a5204bc2f46a5897acb8564981a0399404c96940f7397b06f252105"} Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.204075 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e065bcb4a5204bc2f46a5897acb8564981a0399404c96940f7397b06f252105" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.203823 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-frzc6" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.330253 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2"] Oct 03 14:39:34 crc kubenswrapper[4636]: E1003 14:39:34.330764 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1e8fa7f-c140-4196-8967-ca303b35e8c5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.330799 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1e8fa7f-c140-4196-8967-ca303b35e8c5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.331014 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1e8fa7f-c140-4196-8967-ca303b35e8c5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.331711 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.337004 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.337020 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.337142 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.337905 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.338393 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8b7c8" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.339188 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.362203 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2"] Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.384872 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.384957 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.384978 4636 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.385027 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.385086 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn6rp\" (UniqueName: \"kubernetes.io/projected/4932588e-72ae-44a2-bc95-08cd792a140f-kube-api-access-hn6rp\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.385117 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.487250 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.487319 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.487459 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.487546 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn6rp\" (UniqueName: \"kubernetes.io/projected/4932588e-72ae-44a2-bc95-08cd792a140f-kube-api-access-hn6rp\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.487578 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.487715 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.491860 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.492063 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.494011 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.495487 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.498616 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.508987 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-hn6rp\" (UniqueName: \"kubernetes.io/projected/4932588e-72ae-44a2-bc95-08cd792a140f-kube-api-access-hn6rp\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" Oct 03 14:39:34 crc kubenswrapper[4636]: I1003 14:39:34.689584 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" Oct 03 14:39:35 crc kubenswrapper[4636]: I1003 14:39:35.214228 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2"] Oct 03 14:39:35 crc kubenswrapper[4636]: W1003 14:39:35.223676 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4932588e_72ae_44a2_bc95_08cd792a140f.slice/crio-e8de121af7333fdb130c5eb48e152036ad42d5ec72cad4e70133161ca724c652 WatchSource:0}: Error finding container e8de121af7333fdb130c5eb48e152036ad42d5ec72cad4e70133161ca724c652: Status 404 returned error can't find the container with id e8de121af7333fdb130c5eb48e152036ad42d5ec72cad4e70133161ca724c652 Oct 03 14:39:36 crc kubenswrapper[4636]: I1003 14:39:36.223515 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" event={"ID":"4932588e-72ae-44a2-bc95-08cd792a140f","Type":"ContainerStarted","Data":"b9f64a8cd3f65ea33b72338fd737e737343851bb56111eda4e441943a41b9e8c"} Oct 03 14:39:36 crc kubenswrapper[4636]: I1003 14:39:36.224009 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" event={"ID":"4932588e-72ae-44a2-bc95-08cd792a140f","Type":"ContainerStarted","Data":"e8de121af7333fdb130c5eb48e152036ad42d5ec72cad4e70133161ca724c652"} Oct 03 14:39:36 crc kubenswrapper[4636]: I1003 14:39:36.247302 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" podStartSLOduration=1.805508584 podStartE2EDuration="2.247273759s" podCreationTimestamp="2025-10-03 14:39:34 +0000 UTC" firstStartedPulling="2025-10-03 14:39:35.227152145 +0000 UTC m=+2325.085878392" lastFinishedPulling="2025-10-03 14:39:35.66891732 +0000 UTC m=+2325.527643567" observedRunningTime="2025-10-03 14:39:36.244197579 +0000 UTC m=+2326.102923826" watchObservedRunningTime="2025-10-03 14:39:36.247273759 +0000 UTC m=+2326.106000016" Oct 03 14:39:39 crc kubenswrapper[4636]: I1003 14:39:39.793966 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:39:39 crc kubenswrapper[4636]: E1003 14:39:39.794962 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:39:54 crc kubenswrapper[4636]: I1003 14:39:54.795344 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:39:54 crc kubenswrapper[4636]: E1003 14:39:54.796334 4636 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:40:07 crc kubenswrapper[4636]: I1003 14:40:07.793822 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:40:07 crc kubenswrapper[4636]: E1003 14:40:07.794610 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:40:22 crc kubenswrapper[4636]: I1003 14:40:22.793853 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:40:22 crc kubenswrapper[4636]: E1003 14:40:22.794639 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:40:26 crc kubenswrapper[4636]: I1003 14:40:26.637327 4636 generic.go:334] "Generic (PLEG): container finished" podID="4932588e-72ae-44a2-bc95-08cd792a140f" containerID="b9f64a8cd3f65ea33b72338fd737e737343851bb56111eda4e441943a41b9e8c" exitCode=0 Oct 03 14:40:26 crc kubenswrapper[4636]: I1003 14:40:26.637408 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" event={"ID":"4932588e-72ae-44a2-bc95-08cd792a140f","Type":"ContainerDied","Data":"b9f64a8cd3f65ea33b72338fd737e737343851bb56111eda4e441943a41b9e8c"} Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.072294 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.192060 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-inventory\") pod \"4932588e-72ae-44a2-bc95-08cd792a140f\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.192226 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-ssh-key\") pod \"4932588e-72ae-44a2-bc95-08cd792a140f\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.192323 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn6rp\" (UniqueName: \"kubernetes.io/projected/4932588e-72ae-44a2-bc95-08cd792a140f-kube-api-access-hn6rp\") pod \"4932588e-72ae-44a2-bc95-08cd792a140f\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.192441 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"4932588e-72ae-44a2-bc95-08cd792a140f\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.192490 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-nova-metadata-neutron-config-0\") pod \"4932588e-72ae-44a2-bc95-08cd792a140f\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.192660 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-neutron-metadata-combined-ca-bundle\") pod \"4932588e-72ae-44a2-bc95-08cd792a140f\" (UID: \"4932588e-72ae-44a2-bc95-08cd792a140f\") " Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.197717 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4932588e-72ae-44a2-bc95-08cd792a140f-kube-api-access-hn6rp" (OuterVolumeSpecName: "kube-api-access-hn6rp") pod "4932588e-72ae-44a2-bc95-08cd792a140f" (UID: "4932588e-72ae-44a2-bc95-08cd792a140f"). InnerVolumeSpecName "kube-api-access-hn6rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.201524 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "4932588e-72ae-44a2-bc95-08cd792a140f" (UID: "4932588e-72ae-44a2-bc95-08cd792a140f"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.222124 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-inventory" (OuterVolumeSpecName: "inventory") pod "4932588e-72ae-44a2-bc95-08cd792a140f" (UID: "4932588e-72ae-44a2-bc95-08cd792a140f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.222247 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4932588e-72ae-44a2-bc95-08cd792a140f" (UID: "4932588e-72ae-44a2-bc95-08cd792a140f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.224421 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "4932588e-72ae-44a2-bc95-08cd792a140f" (UID: "4932588e-72ae-44a2-bc95-08cd792a140f"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.224902 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "4932588e-72ae-44a2-bc95-08cd792a140f" (UID: "4932588e-72ae-44a2-bc95-08cd792a140f"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.294986 4636 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.295049 4636 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.295060 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn6rp\" (UniqueName: \"kubernetes.io/projected/4932588e-72ae-44a2-bc95-08cd792a140f-kube-api-access-hn6rp\") on node \"crc\" DevicePath \"\"" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.295071 4636 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.295082 4636 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.295092 4636 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4932588e-72ae-44a2-bc95-08cd792a140f-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.653025 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" event={"ID":"4932588e-72ae-44a2-bc95-08cd792a140f","Type":"ContainerDied","Data":"e8de121af7333fdb130c5eb48e152036ad42d5ec72cad4e70133161ca724c652"} Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.653371 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8de121af7333fdb130c5eb48e152036ad42d5ec72cad4e70133161ca724c652" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.653090 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.831154 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk"] Oct 03 14:40:28 crc kubenswrapper[4636]: E1003 14:40:28.831840 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4932588e-72ae-44a2-bc95-08cd792a140f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.831919 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="4932588e-72ae-44a2-bc95-08cd792a140f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.832193 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="4932588e-72ae-44a2-bc95-08cd792a140f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.832880 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.834687 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.835064 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.835064 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.835604 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8b7c8" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.837919 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.854771 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk"] Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.909117 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk\" (UID: \"917285a7-3281-4326-8837-f1db2fe9a711\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.909164 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk\" (UID: \"917285a7-3281-4326-8837-f1db2fe9a711\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.909255 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk\" (UID: \"917285a7-3281-4326-8837-f1db2fe9a711\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.909332 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rptlk\" (UniqueName: \"kubernetes.io/projected/917285a7-3281-4326-8837-f1db2fe9a711-kube-api-access-rptlk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk\" (UID: \"917285a7-3281-4326-8837-f1db2fe9a711\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" Oct 03 14:40:28 crc kubenswrapper[4636]: I1003 14:40:28.909425 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk\" (UID: \"917285a7-3281-4326-8837-f1db2fe9a711\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" Oct 03 14:40:29 crc kubenswrapper[4636]: I1003 14:40:29.012708 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk\" (UID: \"917285a7-3281-4326-8837-f1db2fe9a711\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" Oct 03 14:40:29 crc kubenswrapper[4636]: I1003 14:40:29.012765 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk\" (UID: \"917285a7-3281-4326-8837-f1db2fe9a711\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" Oct 03 14:40:29 crc kubenswrapper[4636]: I1003 14:40:29.012837 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk\" (UID: \"917285a7-3281-4326-8837-f1db2fe9a711\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" Oct 03 14:40:29 crc kubenswrapper[4636]: I1003 14:40:29.012882 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rptlk\" (UniqueName: \"kubernetes.io/projected/917285a7-3281-4326-8837-f1db2fe9a711-kube-api-access-rptlk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk\" (UID: \"917285a7-3281-4326-8837-f1db2fe9a711\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" Oct 03 14:40:29 crc kubenswrapper[4636]: I1003 14:40:29.012959 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk\" (UID: \"917285a7-3281-4326-8837-f1db2fe9a711\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" Oct 03 14:40:29 crc kubenswrapper[4636]: I1003 14:40:29.018279 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk\" (UID: \"917285a7-3281-4326-8837-f1db2fe9a711\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" Oct 03 14:40:29 crc kubenswrapper[4636]: I1003 14:40:29.018914 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk\" (UID: \"917285a7-3281-4326-8837-f1db2fe9a711\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" Oct 03 14:40:29 crc kubenswrapper[4636]: I1003 14:40:29.026939 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk\" (UID: \"917285a7-3281-4326-8837-f1db2fe9a711\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" Oct 03 14:40:29 crc kubenswrapper[4636]: I1003 14:40:29.031757 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-inventory\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk\" (UID: \"917285a7-3281-4326-8837-f1db2fe9a711\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" Oct 03 14:40:29 crc kubenswrapper[4636]: I1003 14:40:29.033650 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rptlk\" (UniqueName: \"kubernetes.io/projected/917285a7-3281-4326-8837-f1db2fe9a711-kube-api-access-rptlk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk\" (UID: \"917285a7-3281-4326-8837-f1db2fe9a711\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" Oct 03 14:40:29 crc kubenswrapper[4636]: I1003 14:40:29.203650 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" Oct 03 14:40:29 crc kubenswrapper[4636]: I1003 14:40:29.730076 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk"] Oct 03 14:40:29 crc kubenswrapper[4636]: I1003 14:40:29.736446 4636 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 14:40:30 crc kubenswrapper[4636]: I1003 14:40:30.673631 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" event={"ID":"917285a7-3281-4326-8837-f1db2fe9a711","Type":"ContainerStarted","Data":"3fa2899c144cba6138dae9fc53ae66f253a394a2c99e58d24040e95ed24e7f99"} Oct 03 14:40:30 crc kubenswrapper[4636]: I1003 14:40:30.674030 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" event={"ID":"917285a7-3281-4326-8837-f1db2fe9a711","Type":"ContainerStarted","Data":"2024b5eef85012662832b804ffc5f487520b82428bfa0588cadf21a9ff8a4557"} Oct 03 14:40:30 crc kubenswrapper[4636]: I1003 14:40:30.689267 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" podStartSLOduration=2.243808477 podStartE2EDuration="2.689249547s" podCreationTimestamp="2025-10-03 14:40:28 +0000 UTC" firstStartedPulling="2025-10-03 14:40:29.736180033 +0000 UTC m=+2379.594906290" lastFinishedPulling="2025-10-03 14:40:30.181621113 +0000 UTC m=+2380.040347360" observedRunningTime="2025-10-03 14:40:30.685825589 +0000 UTC m=+2380.544551856" watchObservedRunningTime="2025-10-03 14:40:30.689249547 +0000 UTC m=+2380.547975794" Oct 03 14:40:34 crc kubenswrapper[4636]: I1003 14:40:34.794311 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:40:34 crc kubenswrapper[4636]: E1003 14:40:34.796785 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:40:46 crc kubenswrapper[4636]: I1003 14:40:46.793773 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:40:46 crc kubenswrapper[4636]: E1003 14:40:46.794453 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:41:00 crc kubenswrapper[4636]: I1003 14:41:00.801721 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:41:00 crc kubenswrapper[4636]: E1003 14:41:00.803220 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:41:13 crc kubenswrapper[4636]: I1003 14:41:13.794707 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:41:13 crc kubenswrapper[4636]: E1003 14:41:13.795593 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:41:27 crc kubenswrapper[4636]: I1003 14:41:27.799562 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:41:27 crc kubenswrapper[4636]: E1003 14:41:27.800396 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:41:39 crc kubenswrapper[4636]: I1003 14:41:39.794819 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:41:39 crc kubenswrapper[4636]: E1003 14:41:39.797359 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:41:51 crc kubenswrapper[4636]: I1003 14:41:51.794324 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:41:51 crc kubenswrapper[4636]: E1003 14:41:51.795218 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:42:04 crc kubenswrapper[4636]: I1003 14:42:04.795152 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:42:04 crc kubenswrapper[4636]: E1003 14:42:04.795974 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:42:16 crc kubenswrapper[4636]: I1003 14:42:16.794562 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:42:16 crc kubenswrapper[4636]: E1003 14:42:16.795340 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:42:31 crc kubenswrapper[4636]: I1003 14:42:31.794299 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:42:31 crc kubenswrapper[4636]: E1003 14:42:31.796340 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:42:46 crc kubenswrapper[4636]: I1003 14:42:46.794481 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:42:46 crc kubenswrapper[4636]: E1003 14:42:46.796520 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:43:01 crc kubenswrapper[4636]: I1003 14:43:01.794763 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:43:01 crc kubenswrapper[4636]: E1003 14:43:01.795684 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 14:43:12 crc kubenswrapper[4636]: I1003 14:43:12.794313 4636 
scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:43:13 crc kubenswrapper[4636]: I1003 14:43:13.068181 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerStarted","Data":"9b6189a9993a988a8228eca73a7aab98395784c64925bc5bfe9f78c4defdcb4d"} Oct 03 14:44:08 crc kubenswrapper[4636]: I1003 14:44:08.166369 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-54q8l"] Oct 03 14:44:08 crc kubenswrapper[4636]: I1003 14:44:08.169513 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-54q8l" Oct 03 14:44:08 crc kubenswrapper[4636]: I1003 14:44:08.179570 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-54q8l"] Oct 03 14:44:08 crc kubenswrapper[4636]: I1003 14:44:08.281604 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d-utilities\") pod \"redhat-operators-54q8l\" (UID: \"5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d\") " pod="openshift-marketplace/redhat-operators-54q8l" Oct 03 14:44:08 crc kubenswrapper[4636]: I1003 14:44:08.281673 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d-catalog-content\") pod \"redhat-operators-54q8l\" (UID: \"5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d\") " pod="openshift-marketplace/redhat-operators-54q8l" Oct 03 14:44:08 crc kubenswrapper[4636]: I1003 14:44:08.281741 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t9rn\" (UniqueName: \"kubernetes.io/projected/5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d-kube-api-access-2t9rn\") pod \"redhat-operators-54q8l\" (UID: \"5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d\") " pod="openshift-marketplace/redhat-operators-54q8l" Oct 03 14:44:08 crc kubenswrapper[4636]: I1003 14:44:08.383998 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t9rn\" (UniqueName: \"kubernetes.io/projected/5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d-kube-api-access-2t9rn\") pod \"redhat-operators-54q8l\" (UID: \"5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d\") " pod="openshift-marketplace/redhat-operators-54q8l" Oct 03 14:44:08 crc kubenswrapper[4636]: I1003 14:44:08.384199 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d-utilities\") pod \"redhat-operators-54q8l\" (UID: \"5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d\") " pod="openshift-marketplace/redhat-operators-54q8l" Oct 03 14:44:08 crc kubenswrapper[4636]: I1003 14:44:08.384241 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d-catalog-content\") pod \"redhat-operators-54q8l\" (UID: \"5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d\") " pod="openshift-marketplace/redhat-operators-54q8l" Oct 03 14:44:08 crc kubenswrapper[4636]: I1003 14:44:08.384811 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d-catalog-content\") pod \"redhat-operators-54q8l\" (UID: \"5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d\") " pod="openshift-marketplace/redhat-operators-54q8l" Oct 03 14:44:08 crc kubenswrapper[4636]: I1003 14:44:08.384818 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d-utilities\") pod \"redhat-operators-54q8l\" (UID: \"5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d\") " pod="openshift-marketplace/redhat-operators-54q8l" Oct 03 14:44:08 crc kubenswrapper[4636]: I1003 14:44:08.413891 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t9rn\" (UniqueName: \"kubernetes.io/projected/5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d-kube-api-access-2t9rn\") pod \"redhat-operators-54q8l\" (UID: \"5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d\") " pod="openshift-marketplace/redhat-operators-54q8l" Oct 03 14:44:08 crc kubenswrapper[4636]: I1003 14:44:08.496574 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-54q8l" Oct 03 14:44:09 crc kubenswrapper[4636]: I1003 14:44:09.033337 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-54q8l"] Oct 03 14:44:09 crc kubenswrapper[4636]: I1003 14:44:09.536780 4636 generic.go:334] "Generic (PLEG): container finished" podID="5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d" containerID="846e6d015021ad9e42045c4c0c14324d0a53a65c640ed4439621738fdf9feb6f" exitCode=0 Oct 03 14:44:09 crc kubenswrapper[4636]: I1003 14:44:09.536817 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54q8l" event={"ID":"5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d","Type":"ContainerDied","Data":"846e6d015021ad9e42045c4c0c14324d0a53a65c640ed4439621738fdf9feb6f"} Oct 03 14:44:09 crc kubenswrapper[4636]: I1003 14:44:09.537080 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54q8l" event={"ID":"5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d","Type":"ContainerStarted","Data":"fe2b42b4451a674692a7b4ef5d79b67278ee14e71f3fd53bca9cc2c8f20f320f"} Oct 03 14:44:10 crc kubenswrapper[4636]: I1003 14:44:10.546292 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54q8l" event={"ID":"5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d","Type":"ContainerStarted","Data":"6daa7ce3e6c912a94499d817511027f52e0fa9c550476e45ef8cd7f8577a5755"} Oct 03 14:44:17 crc kubenswrapper[4636]: I1003 14:44:17.604613 4636 generic.go:334] "Generic (PLEG): container finished" podID="5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d" containerID="6daa7ce3e6c912a94499d817511027f52e0fa9c550476e45ef8cd7f8577a5755" exitCode=0 Oct 03 14:44:17 crc kubenswrapper[4636]: I1003 14:44:17.604692 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54q8l" event={"ID":"5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d","Type":"ContainerDied","Data":"6daa7ce3e6c912a94499d817511027f52e0fa9c550476e45ef8cd7f8577a5755"} Oct 03 14:44:18 crc kubenswrapper[4636]: I1003 14:44:18.617248 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54q8l" event={"ID":"5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d","Type":"ContainerStarted","Data":"d0be9f24db916a73ed149c2727bff6703f9863c7bc76b3c36475b4c8d6e2e5f4"} Oct 03 14:44:18 crc kubenswrapper[4636]: I1003 14:44:18.640677 4636 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/redhat-operators-54q8l" podStartSLOduration=2.16506486 podStartE2EDuration="10.640658291s" podCreationTimestamp="2025-10-03 14:44:08 +0000 UTC" firstStartedPulling="2025-10-03 14:44:09.538348635 +0000 UTC m=+2599.397074882" lastFinishedPulling="2025-10-03 14:44:18.013942046 +0000 UTC m=+2607.872668313" observedRunningTime="2025-10-03 14:44:18.634438631 +0000 UTC m=+2608.493164898" watchObservedRunningTime="2025-10-03 14:44:18.640658291 +0000 UTC m=+2608.499384538" Oct 03 14:44:28 crc kubenswrapper[4636]: I1003 14:44:28.497794 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-54q8l" Oct 03 14:44:28 crc kubenswrapper[4636]: I1003 14:44:28.498378 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-54q8l" Oct 03 14:44:28 crc kubenswrapper[4636]: I1003 14:44:28.543677 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-54q8l" Oct 03 14:44:28 crc kubenswrapper[4636]: I1003 14:44:28.748784 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-54q8l" Oct 03 14:44:28 crc kubenswrapper[4636]: I1003 14:44:28.804516 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-54q8l"] Oct 03 14:44:30 crc kubenswrapper[4636]: I1003 14:44:30.713052 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-54q8l" podUID="5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d" containerName="registry-server" containerID="cri-o://d0be9f24db916a73ed149c2727bff6703f9863c7bc76b3c36475b4c8d6e2e5f4" gracePeriod=2 Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.139357 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-54q8l" Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.307728 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d-utilities\") pod \"5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d\" (UID: \"5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d\") " Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.308123 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t9rn\" (UniqueName: \"kubernetes.io/projected/5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d-kube-api-access-2t9rn\") pod \"5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d\" (UID: \"5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d\") " Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.308300 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d-catalog-content\") pod \"5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d\" (UID: \"5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d\") " Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.309244 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d-utilities" (OuterVolumeSpecName: "utilities") pod "5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d" (UID: "5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.316011 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d-kube-api-access-2t9rn" (OuterVolumeSpecName: "kube-api-access-2t9rn") pod "5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d" (UID: "5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d"). InnerVolumeSpecName "kube-api-access-2t9rn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.397962 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d" (UID: "5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.410801 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t9rn\" (UniqueName: \"kubernetes.io/projected/5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d-kube-api-access-2t9rn\") on node \"crc\" DevicePath \"\"" Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.410836 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.410845 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.724504 4636 generic.go:334] "Generic (PLEG): container finished" podID="5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d" containerID="d0be9f24db916a73ed149c2727bff6703f9863c7bc76b3c36475b4c8d6e2e5f4" exitCode=0 Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.724565 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-54q8l" Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.724583 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54q8l" event={"ID":"5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d","Type":"ContainerDied","Data":"d0be9f24db916a73ed149c2727bff6703f9863c7bc76b3c36475b4c8d6e2e5f4"} Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.725526 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54q8l" event={"ID":"5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d","Type":"ContainerDied","Data":"fe2b42b4451a674692a7b4ef5d79b67278ee14e71f3fd53bca9cc2c8f20f320f"} Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.725568 4636 scope.go:117] "RemoveContainer" containerID="d0be9f24db916a73ed149c2727bff6703f9863c7bc76b3c36475b4c8d6e2e5f4" Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.758982 4636 scope.go:117] "RemoveContainer" containerID="6daa7ce3e6c912a94499d817511027f52e0fa9c550476e45ef8cd7f8577a5755" Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.768334 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-54q8l"] Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.781324 4636 scope.go:117] "RemoveContainer" containerID="846e6d015021ad9e42045c4c0c14324d0a53a65c640ed4439621738fdf9feb6f" Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.782207 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-54q8l"] Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.825304 4636 scope.go:117] "RemoveContainer" containerID="d0be9f24db916a73ed149c2727bff6703f9863c7bc76b3c36475b4c8d6e2e5f4" Oct 03 14:44:31 crc kubenswrapper[4636]: E1003 14:44:31.826354 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0be9f24db916a73ed149c2727bff6703f9863c7bc76b3c36475b4c8d6e2e5f4\": container with ID starting with d0be9f24db916a73ed149c2727bff6703f9863c7bc76b3c36475b4c8d6e2e5f4 not found: ID does not exist" containerID="d0be9f24db916a73ed149c2727bff6703f9863c7bc76b3c36475b4c8d6e2e5f4" Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.826406 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0be9f24db916a73ed149c2727bff6703f9863c7bc76b3c36475b4c8d6e2e5f4"} err="failed to get container status \"d0be9f24db916a73ed149c2727bff6703f9863c7bc76b3c36475b4c8d6e2e5f4\": rpc error: code = NotFound desc = could not find container \"d0be9f24db916a73ed149c2727bff6703f9863c7bc76b3c36475b4c8d6e2e5f4\": container with ID starting with d0be9f24db916a73ed149c2727bff6703f9863c7bc76b3c36475b4c8d6e2e5f4 not found: ID does not exist" Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.826440 4636 scope.go:117] "RemoveContainer" containerID="6daa7ce3e6c912a94499d817511027f52e0fa9c550476e45ef8cd7f8577a5755" Oct 03 14:44:31 crc kubenswrapper[4636]: E1003 14:44:31.826897 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6daa7ce3e6c912a94499d817511027f52e0fa9c550476e45ef8cd7f8577a5755\": container with ID starting with 6daa7ce3e6c912a94499d817511027f52e0fa9c550476e45ef8cd7f8577a5755 not found: ID does not exist" containerID="6daa7ce3e6c912a94499d817511027f52e0fa9c550476e45ef8cd7f8577a5755" Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.826940 4636 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6daa7ce3e6c912a94499d817511027f52e0fa9c550476e45ef8cd7f8577a5755"} err="failed to get container status \"6daa7ce3e6c912a94499d817511027f52e0fa9c550476e45ef8cd7f8577a5755\": rpc error: code = NotFound desc = could not find container \"6daa7ce3e6c912a94499d817511027f52e0fa9c550476e45ef8cd7f8577a5755\": container with ID starting with 6daa7ce3e6c912a94499d817511027f52e0fa9c550476e45ef8cd7f8577a5755 not found: ID does not exist" Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.826957 4636 scope.go:117] "RemoveContainer" containerID="846e6d015021ad9e42045c4c0c14324d0a53a65c640ed4439621738fdf9feb6f" Oct 03 14:44:31 crc kubenswrapper[4636]: E1003 14:44:31.827337 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"846e6d015021ad9e42045c4c0c14324d0a53a65c640ed4439621738fdf9feb6f\": container with ID starting with 846e6d015021ad9e42045c4c0c14324d0a53a65c640ed4439621738fdf9feb6f not found: ID does not exist" containerID="846e6d015021ad9e42045c4c0c14324d0a53a65c640ed4439621738fdf9feb6f" Oct 03 14:44:31 crc kubenswrapper[4636]: I1003 14:44:31.827460 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"846e6d015021ad9e42045c4c0c14324d0a53a65c640ed4439621738fdf9feb6f"} err="failed to get container status \"846e6d015021ad9e42045c4c0c14324d0a53a65c640ed4439621738fdf9feb6f\": rpc error: code = NotFound desc = could not find container \"846e6d015021ad9e42045c4c0c14324d0a53a65c640ed4439621738fdf9feb6f\": container with ID starting with 846e6d015021ad9e42045c4c0c14324d0a53a65c640ed4439621738fdf9feb6f not found: ID does not exist" Oct 03 14:44:32 crc kubenswrapper[4636]: I1003 14:44:32.804941 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d" path="/var/lib/kubelet/pods/5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d/volumes" Oct 03 14:45:00 crc kubenswrapper[4636]: I1003 14:45:00.142750 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325045-gqkt7"] Oct 03 14:45:00 crc kubenswrapper[4636]: E1003 14:45:00.143647 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d" containerName="extract-content" Oct 03 14:45:00 crc kubenswrapper[4636]: I1003 14:45:00.143660 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d" containerName="extract-content" Oct 03 14:45:00 crc kubenswrapper[4636]: E1003 14:45:00.143675 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d" containerName="registry-server" Oct 03 14:45:00 crc kubenswrapper[4636]: I1003 14:45:00.143681 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d" containerName="registry-server" Oct 03 14:45:00 crc kubenswrapper[4636]: E1003 14:45:00.143701 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d" containerName="extract-utilities" Oct 03 14:45:00 crc kubenswrapper[4636]: I1003 14:45:00.143707 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d" containerName="extract-utilities" Oct 03 14:45:00 crc kubenswrapper[4636]: I1003 14:45:00.143886 4636 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5e4e5b20-a692-4b2b-96fd-ae40fd74ca8d" containerName="registry-server" Oct 03 14:45:00 crc kubenswrapper[4636]: I1003 14:45:00.144653 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-gqkt7" Oct 03 14:45:00 crc kubenswrapper[4636]: I1003 14:45:00.151260 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 14:45:00 crc kubenswrapper[4636]: I1003 14:45:00.151479 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 14:45:00 crc kubenswrapper[4636]: I1003 14:45:00.157951 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325045-gqkt7"] Oct 03 14:45:00 crc kubenswrapper[4636]: I1003 14:45:00.255906 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c6ce88c-287d-4151-b6ab-5c36bb092862-config-volume\") pod \"collect-profiles-29325045-gqkt7\" (UID: \"1c6ce88c-287d-4151-b6ab-5c36bb092862\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-gqkt7" Oct 03 14:45:00 crc kubenswrapper[4636]: I1003 14:45:00.256208 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c6ce88c-287d-4151-b6ab-5c36bb092862-secret-volume\") pod \"collect-profiles-29325045-gqkt7\" (UID: \"1c6ce88c-287d-4151-b6ab-5c36bb092862\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-gqkt7" Oct 03 14:45:00 crc kubenswrapper[4636]: I1003 14:45:00.256475 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8zfj\" (UniqueName: \"kubernetes.io/projected/1c6ce88c-287d-4151-b6ab-5c36bb092862-kube-api-access-q8zfj\") pod \"collect-profiles-29325045-gqkt7\" (UID: \"1c6ce88c-287d-4151-b6ab-5c36bb092862\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-gqkt7" Oct 03 14:45:00 crc kubenswrapper[4636]: I1003 14:45:00.359121 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c6ce88c-287d-4151-b6ab-5c36bb092862-secret-volume\") pod \"collect-profiles-29325045-gqkt7\" (UID: \"1c6ce88c-287d-4151-b6ab-5c36bb092862\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-gqkt7" Oct 03 14:45:00 crc kubenswrapper[4636]: I1003 14:45:00.359231 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8zfj\" (UniqueName: \"kubernetes.io/projected/1c6ce88c-287d-4151-b6ab-5c36bb092862-kube-api-access-q8zfj\") pod \"collect-profiles-29325045-gqkt7\" (UID: \"1c6ce88c-287d-4151-b6ab-5c36bb092862\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-gqkt7" Oct 03 14:45:00 crc kubenswrapper[4636]: I1003 14:45:00.359395 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c6ce88c-287d-4151-b6ab-5c36bb092862-config-volume\") pod \"collect-profiles-29325045-gqkt7\" (UID: \"1c6ce88c-287d-4151-b6ab-5c36bb092862\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-gqkt7" Oct 03 14:45:00 crc kubenswrapper[4636]: I1003 14:45:00.360326 
4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c6ce88c-287d-4151-b6ab-5c36bb092862-config-volume\") pod \"collect-profiles-29325045-gqkt7\" (UID: \"1c6ce88c-287d-4151-b6ab-5c36bb092862\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-gqkt7" Oct 03 14:45:00 crc kubenswrapper[4636]: I1003 14:45:00.374286 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c6ce88c-287d-4151-b6ab-5c36bb092862-secret-volume\") pod \"collect-profiles-29325045-gqkt7\" (UID: \"1c6ce88c-287d-4151-b6ab-5c36bb092862\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-gqkt7" Oct 03 14:45:00 crc kubenswrapper[4636]: I1003 14:45:00.379765 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8zfj\" (UniqueName: \"kubernetes.io/projected/1c6ce88c-287d-4151-b6ab-5c36bb092862-kube-api-access-q8zfj\") pod \"collect-profiles-29325045-gqkt7\" (UID: \"1c6ce88c-287d-4151-b6ab-5c36bb092862\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-gqkt7" Oct 03 14:45:00 crc kubenswrapper[4636]: I1003 14:45:00.466957 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-gqkt7" Oct 03 14:45:00 crc kubenswrapper[4636]: I1003 14:45:00.971939 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325045-gqkt7"] Oct 03 14:45:01 crc kubenswrapper[4636]: I1003 14:45:01.985659 4636 generic.go:334] "Generic (PLEG): container finished" podID="1c6ce88c-287d-4151-b6ab-5c36bb092862" containerID="99f6d2ac7a43a2363484d4f1e02e32a5b28030ec809b504acb8388411f23f5b0" exitCode=0 Oct 03 14:45:01 crc kubenswrapper[4636]: I1003 14:45:01.985737 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-gqkt7" event={"ID":"1c6ce88c-287d-4151-b6ab-5c36bb092862","Type":"ContainerDied","Data":"99f6d2ac7a43a2363484d4f1e02e32a5b28030ec809b504acb8388411f23f5b0"} Oct 03 14:45:01 crc kubenswrapper[4636]: I1003 14:45:01.985940 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-gqkt7" event={"ID":"1c6ce88c-287d-4151-b6ab-5c36bb092862","Type":"ContainerStarted","Data":"26810a2da942f35e1c639c2bdd8b7bd28eb4bd4a2c1a9251003b848470d8a4b2"} Oct 03 14:45:03 crc kubenswrapper[4636]: I1003 14:45:03.277383 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-gqkt7" Oct 03 14:45:03 crc kubenswrapper[4636]: I1003 14:45:03.418928 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c6ce88c-287d-4151-b6ab-5c36bb092862-secret-volume\") pod \"1c6ce88c-287d-4151-b6ab-5c36bb092862\" (UID: \"1c6ce88c-287d-4151-b6ab-5c36bb092862\") " Oct 03 14:45:03 crc kubenswrapper[4636]: I1003 14:45:03.419269 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8zfj\" (UniqueName: \"kubernetes.io/projected/1c6ce88c-287d-4151-b6ab-5c36bb092862-kube-api-access-q8zfj\") pod \"1c6ce88c-287d-4151-b6ab-5c36bb092862\" (UID: \"1c6ce88c-287d-4151-b6ab-5c36bb092862\") " Oct 03 14:45:03 crc kubenswrapper[4636]: I1003 14:45:03.420949 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c6ce88c-287d-4151-b6ab-5c36bb092862-config-volume\") pod \"1c6ce88c-287d-4151-b6ab-5c36bb092862\" (UID: \"1c6ce88c-287d-4151-b6ab-5c36bb092862\") " Oct 03 14:45:03 crc kubenswrapper[4636]: I1003 14:45:03.421897 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c6ce88c-287d-4151-b6ab-5c36bb092862-config-volume" (OuterVolumeSpecName: "config-volume") pod "1c6ce88c-287d-4151-b6ab-5c36bb092862" (UID: "1c6ce88c-287d-4151-b6ab-5c36bb092862"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:45:03 crc kubenswrapper[4636]: I1003 14:45:03.423414 4636 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c6ce88c-287d-4151-b6ab-5c36bb092862-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 14:45:03 crc kubenswrapper[4636]: I1003 14:45:03.428821 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c6ce88c-287d-4151-b6ab-5c36bb092862-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1c6ce88c-287d-4151-b6ab-5c36bb092862" (UID: "1c6ce88c-287d-4151-b6ab-5c36bb092862"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:45:03 crc kubenswrapper[4636]: I1003 14:45:03.435287 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c6ce88c-287d-4151-b6ab-5c36bb092862-kube-api-access-q8zfj" (OuterVolumeSpecName: "kube-api-access-q8zfj") pod "1c6ce88c-287d-4151-b6ab-5c36bb092862" (UID: "1c6ce88c-287d-4151-b6ab-5c36bb092862"). InnerVolumeSpecName "kube-api-access-q8zfj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:45:03 crc kubenswrapper[4636]: I1003 14:45:03.524840 4636 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c6ce88c-287d-4151-b6ab-5c36bb092862-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 14:45:03 crc kubenswrapper[4636]: I1003 14:45:03.525066 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8zfj\" (UniqueName: \"kubernetes.io/projected/1c6ce88c-287d-4151-b6ab-5c36bb092862-kube-api-access-q8zfj\") on node \"crc\" DevicePath \"\"" Oct 03 14:45:04 crc kubenswrapper[4636]: I1003 14:45:04.005128 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-gqkt7" event={"ID":"1c6ce88c-287d-4151-b6ab-5c36bb092862","Type":"ContainerDied","Data":"26810a2da942f35e1c639c2bdd8b7bd28eb4bd4a2c1a9251003b848470d8a4b2"} Oct 03 14:45:04 crc kubenswrapper[4636]: I1003 14:45:04.005175 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26810a2da942f35e1c639c2bdd8b7bd28eb4bd4a2c1a9251003b848470d8a4b2" Oct 03 14:45:04 crc kubenswrapper[4636]: I1003 14:45:04.005181 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325045-gqkt7" Oct 03 14:45:04 crc kubenswrapper[4636]: I1003 14:45:04.366646 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf"] Oct 03 14:45:04 crc kubenswrapper[4636]: I1003 14:45:04.375991 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325000-fk4kf"] Oct 03 14:45:04 crc kubenswrapper[4636]: I1003 14:45:04.807520 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64802881-57b2-4263-b5d8-f3c4c224c692" path="/var/lib/kubelet/pods/64802881-57b2-4263-b5d8-f3c4c224c692/volumes" Oct 03 14:45:09 crc kubenswrapper[4636]: I1003 14:45:09.057276 4636 generic.go:334] "Generic (PLEG): container finished" podID="917285a7-3281-4326-8837-f1db2fe9a711" containerID="3fa2899c144cba6138dae9fc53ae66f253a394a2c99e58d24040e95ed24e7f99" exitCode=0 Oct 03 14:45:09 crc kubenswrapper[4636]: I1003 14:45:09.057363 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" event={"ID":"917285a7-3281-4326-8837-f1db2fe9a711","Type":"ContainerDied","Data":"3fa2899c144cba6138dae9fc53ae66f253a394a2c99e58d24040e95ed24e7f99"} Oct 03 14:45:09 crc kubenswrapper[4636]: I1003 14:45:09.955054 4636 scope.go:117] "RemoveContainer" containerID="cf184242456787a73f2999d28c4eb1472742241b8b3a158f2e7c20f76ead3285" Oct 03 14:45:10 crc kubenswrapper[4636]: I1003 14:45:10.429942 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" Oct 03 14:45:10 crc kubenswrapper[4636]: I1003 14:45:10.566469 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rptlk\" (UniqueName: \"kubernetes.io/projected/917285a7-3281-4326-8837-f1db2fe9a711-kube-api-access-rptlk\") pod \"917285a7-3281-4326-8837-f1db2fe9a711\" (UID: \"917285a7-3281-4326-8837-f1db2fe9a711\") " Oct 03 14:45:10 crc kubenswrapper[4636]: I1003 14:45:10.566558 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-inventory\") pod \"917285a7-3281-4326-8837-f1db2fe9a711\" (UID: \"917285a7-3281-4326-8837-f1db2fe9a711\") " Oct 03 14:45:10 crc kubenswrapper[4636]: I1003 14:45:10.566627 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-libvirt-secret-0\") pod \"917285a7-3281-4326-8837-f1db2fe9a711\" (UID: \"917285a7-3281-4326-8837-f1db2fe9a711\") " Oct 03 14:45:10 crc kubenswrapper[4636]: I1003 14:45:10.566703 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-libvirt-combined-ca-bundle\") pod \"917285a7-3281-4326-8837-f1db2fe9a711\" (UID: \"917285a7-3281-4326-8837-f1db2fe9a711\") " Oct 03 14:45:10 crc kubenswrapper[4636]: I1003 14:45:10.566784 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-ssh-key\") pod \"917285a7-3281-4326-8837-f1db2fe9a711\" (UID: \"917285a7-3281-4326-8837-f1db2fe9a711\") " Oct 03 14:45:10 crc kubenswrapper[4636]: I1003 14:45:10.578818 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/917285a7-3281-4326-8837-f1db2fe9a711-kube-api-access-rptlk" (OuterVolumeSpecName: "kube-api-access-rptlk") pod "917285a7-3281-4326-8837-f1db2fe9a711" (UID: "917285a7-3281-4326-8837-f1db2fe9a711"). InnerVolumeSpecName "kube-api-access-rptlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:45:10 crc kubenswrapper[4636]: I1003 14:45:10.579230 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "917285a7-3281-4326-8837-f1db2fe9a711" (UID: "917285a7-3281-4326-8837-f1db2fe9a711"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:45:10 crc kubenswrapper[4636]: I1003 14:45:10.599961 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "917285a7-3281-4326-8837-f1db2fe9a711" (UID: "917285a7-3281-4326-8837-f1db2fe9a711"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:45:10 crc kubenswrapper[4636]: I1003 14:45:10.600137 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-inventory" (OuterVolumeSpecName: "inventory") pod "917285a7-3281-4326-8837-f1db2fe9a711" (UID: "917285a7-3281-4326-8837-f1db2fe9a711"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:45:10 crc kubenswrapper[4636]: I1003 14:45:10.603660 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "917285a7-3281-4326-8837-f1db2fe9a711" (UID: "917285a7-3281-4326-8837-f1db2fe9a711"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:45:10 crc kubenswrapper[4636]: I1003 14:45:10.669001 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rptlk\" (UniqueName: \"kubernetes.io/projected/917285a7-3281-4326-8837-f1db2fe9a711-kube-api-access-rptlk\") on node \"crc\" DevicePath \"\"" Oct 03 14:45:10 crc kubenswrapper[4636]: I1003 14:45:10.669263 4636 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 14:45:10 crc kubenswrapper[4636]: I1003 14:45:10.669361 4636 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 03 14:45:10 crc kubenswrapper[4636]: I1003 14:45:10.669433 4636 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:45:10 crc kubenswrapper[4636]: I1003 14:45:10.669496 4636 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/917285a7-3281-4326-8837-f1db2fe9a711-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.078806 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" event={"ID":"917285a7-3281-4326-8837-f1db2fe9a711","Type":"ContainerDied","Data":"2024b5eef85012662832b804ffc5f487520b82428bfa0588cadf21a9ff8a4557"} Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.078881 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2024b5eef85012662832b804ffc5f487520b82428bfa0588cadf21a9ff8a4557" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.078948 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.279891 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp"] Oct 03 14:45:11 crc kubenswrapper[4636]: E1003 14:45:11.280263 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6ce88c-287d-4151-b6ab-5c36bb092862" containerName="collect-profiles" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.280281 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6ce88c-287d-4151-b6ab-5c36bb092862" containerName="collect-profiles" Oct 03 14:45:11 crc kubenswrapper[4636]: E1003 14:45:11.280327 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="917285a7-3281-4326-8837-f1db2fe9a711" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.280334 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="917285a7-3281-4326-8837-f1db2fe9a711" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.280491 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6ce88c-287d-4151-b6ab-5c36bb092862" containerName="collect-profiles" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.280517 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="917285a7-3281-4326-8837-f1db2fe9a711" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.281125 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.287018 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.287393 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.287504 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.288901 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.289877 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.292492 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8b7c8" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.292955 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.300390 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp"] Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.382456 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.382514 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.382553 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.382674 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddxvb\" (UniqueName: \"kubernetes.io/projected/ee4e092c-de87-4547-a39a-1a451ef9dc64-kube-api-access-ddxvb\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.382730 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.382795 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.383061 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.383142 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.383169 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-cell1-compute-config-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.484946 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.485683 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.485714 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.485794 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.485819 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.485852 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.485877 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddxvb\" (UniqueName: \"kubernetes.io/projected/ee4e092c-de87-4547-a39a-1a451ef9dc64-kube-api-access-ddxvb\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.485895 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: 
\"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.485925 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.486748 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.489122 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.489165 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.489577 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.489993 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.491969 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.492065 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.495916 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.504894 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddxvb\" (UniqueName: \"kubernetes.io/projected/ee4e092c-de87-4547-a39a-1a451ef9dc64-kube-api-access-ddxvb\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ggwhp\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:11 crc kubenswrapper[4636]: I1003 14:45:11.596439 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:45:12 crc kubenswrapper[4636]: I1003 14:45:12.127902 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp"] Oct 03 14:45:12 crc kubenswrapper[4636]: W1003 14:45:12.139958 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee4e092c_de87_4547_a39a_1a451ef9dc64.slice/crio-83247dfb0da13858ace760ef8827b81b9ec86fd3873f833844a58d4ca62d51c3 WatchSource:0}: Error finding container 83247dfb0da13858ace760ef8827b81b9ec86fd3873f833844a58d4ca62d51c3: Status 404 returned error can't find the container with id 83247dfb0da13858ace760ef8827b81b9ec86fd3873f833844a58d4ca62d51c3 Oct 03 14:45:13 crc kubenswrapper[4636]: I1003 14:45:13.101583 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" event={"ID":"ee4e092c-de87-4547-a39a-1a451ef9dc64","Type":"ContainerStarted","Data":"9f53e838ef5bae27a353590a84aec336acabe3cb06f777fd6edc6749dea22d51"} Oct 03 14:45:13 crc kubenswrapper[4636]: I1003 14:45:13.101898 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" event={"ID":"ee4e092c-de87-4547-a39a-1a451ef9dc64","Type":"ContainerStarted","Data":"83247dfb0da13858ace760ef8827b81b9ec86fd3873f833844a58d4ca62d51c3"} Oct 03 14:45:13 crc kubenswrapper[4636]: I1003 14:45:13.123515 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" podStartSLOduration=1.575477561 podStartE2EDuration="2.12349787s" podCreationTimestamp="2025-10-03 14:45:11 +0000 UTC" firstStartedPulling="2025-10-03 14:45:12.143388664 +0000 UTC m=+2662.002114911" lastFinishedPulling="2025-10-03 14:45:12.691408973 +0000 UTC m=+2662.550135220" observedRunningTime="2025-10-03 14:45:13.120903199 +0000 UTC m=+2662.979629456" watchObservedRunningTime="2025-10-03 14:45:13.12349787 +0000 UTC m=+2662.982224117" Oct 03 14:45:34 crc kubenswrapper[4636]: I1003 14:45:34.380815 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z75ww"] Oct 03 14:45:34 crc kubenswrapper[4636]: I1003 14:45:34.383299 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z75ww" Oct 03 14:45:34 crc kubenswrapper[4636]: I1003 14:45:34.391912 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z75ww"] Oct 03 14:45:34 crc kubenswrapper[4636]: I1003 14:45:34.519252 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74b57add-2980-4266-82f1-145cbc2417f5-catalog-content\") pod \"certified-operators-z75ww\" (UID: \"74b57add-2980-4266-82f1-145cbc2417f5\") " pod="openshift-marketplace/certified-operators-z75ww" Oct 03 14:45:34 crc kubenswrapper[4636]: I1003 14:45:34.519404 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnch5\" (UniqueName: \"kubernetes.io/projected/74b57add-2980-4266-82f1-145cbc2417f5-kube-api-access-tnch5\") pod \"certified-operators-z75ww\" (UID: \"74b57add-2980-4266-82f1-145cbc2417f5\") " pod="openshift-marketplace/certified-operators-z75ww" Oct 03 14:45:34 crc kubenswrapper[4636]: I1003 14:45:34.519535 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74b57add-2980-4266-82f1-145cbc2417f5-utilities\") pod \"certified-operators-z75ww\" (UID: \"74b57add-2980-4266-82f1-145cbc2417f5\") " pod="openshift-marketplace/certified-operators-z75ww" Oct 03 14:45:34 crc kubenswrapper[4636]: I1003 14:45:34.622121 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74b57add-2980-4266-82f1-145cbc2417f5-catalog-content\") pod \"certified-operators-z75ww\" (UID: \"74b57add-2980-4266-82f1-145cbc2417f5\") " pod="openshift-marketplace/certified-operators-z75ww" Oct 03 14:45:34 crc kubenswrapper[4636]: I1003 14:45:34.622226 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnch5\" (UniqueName: \"kubernetes.io/projected/74b57add-2980-4266-82f1-145cbc2417f5-kube-api-access-tnch5\") pod \"certified-operators-z75ww\" (UID: \"74b57add-2980-4266-82f1-145cbc2417f5\") " pod="openshift-marketplace/certified-operators-z75ww" Oct 03 14:45:34 crc kubenswrapper[4636]: I1003 14:45:34.622286 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74b57add-2980-4266-82f1-145cbc2417f5-utilities\") pod \"certified-operators-z75ww\" (UID: \"74b57add-2980-4266-82f1-145cbc2417f5\") " pod="openshift-marketplace/certified-operators-z75ww" Oct 03 14:45:34 crc kubenswrapper[4636]: I1003 14:45:34.622831 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74b57add-2980-4266-82f1-145cbc2417f5-utilities\") pod \"certified-operators-z75ww\" (UID: \"74b57add-2980-4266-82f1-145cbc2417f5\") " pod="openshift-marketplace/certified-operators-z75ww" Oct 03 14:45:34 crc kubenswrapper[4636]: I1003 14:45:34.623074 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74b57add-2980-4266-82f1-145cbc2417f5-catalog-content\") pod \"certified-operators-z75ww\" (UID: \"74b57add-2980-4266-82f1-145cbc2417f5\") " pod="openshift-marketplace/certified-operators-z75ww" Oct 03 14:45:34 crc kubenswrapper[4636]: I1003 14:45:34.661606 4636 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tnch5\" (UniqueName: \"kubernetes.io/projected/74b57add-2980-4266-82f1-145cbc2417f5-kube-api-access-tnch5\") pod \"certified-operators-z75ww\" (UID: \"74b57add-2980-4266-82f1-145cbc2417f5\") " pod="openshift-marketplace/certified-operators-z75ww" Oct 03 14:45:34 crc kubenswrapper[4636]: I1003 14:45:34.702744 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z75ww" Oct 03 14:45:35 crc kubenswrapper[4636]: I1003 14:45:35.244741 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z75ww"] Oct 03 14:45:35 crc kubenswrapper[4636]: I1003 14:45:35.291660 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z75ww" event={"ID":"74b57add-2980-4266-82f1-145cbc2417f5","Type":"ContainerStarted","Data":"d3951ab110d0a8e157a3b9a8c0b85a71e224ca2ef9564f3e1bed281e285a22de"} Oct 03 14:45:36 crc kubenswrapper[4636]: I1003 14:45:36.303623 4636 generic.go:334] "Generic (PLEG): container finished" podID="74b57add-2980-4266-82f1-145cbc2417f5" containerID="0ebbade4f4c62c2e5c0902c90f8966e6ef9594d8ff960e8e23e6b09f7df8161b" exitCode=0 Oct 03 14:45:36 crc kubenswrapper[4636]: I1003 14:45:36.305049 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z75ww" event={"ID":"74b57add-2980-4266-82f1-145cbc2417f5","Type":"ContainerDied","Data":"0ebbade4f4c62c2e5c0902c90f8966e6ef9594d8ff960e8e23e6b09f7df8161b"} Oct 03 14:45:36 crc kubenswrapper[4636]: I1003 14:45:36.306002 4636 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 14:45:37 crc kubenswrapper[4636]: I1003 14:45:37.312665 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z75ww" event={"ID":"74b57add-2980-4266-82f1-145cbc2417f5","Type":"ContainerStarted","Data":"ada7a76458952e4b75606da5cd5883d7b7da9f62c15af8c80591f0fbdf48eaec"} Oct 03 14:45:38 crc kubenswrapper[4636]: I1003 14:45:38.325978 4636 generic.go:334] "Generic (PLEG): container finished" podID="74b57add-2980-4266-82f1-145cbc2417f5" containerID="ada7a76458952e4b75606da5cd5883d7b7da9f62c15af8c80591f0fbdf48eaec" exitCode=0 Oct 03 14:45:38 crc kubenswrapper[4636]: I1003 14:45:38.326046 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z75ww" event={"ID":"74b57add-2980-4266-82f1-145cbc2417f5","Type":"ContainerDied","Data":"ada7a76458952e4b75606da5cd5883d7b7da9f62c15af8c80591f0fbdf48eaec"} Oct 03 14:45:39 crc kubenswrapper[4636]: I1003 14:45:39.162648 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:45:39 crc kubenswrapper[4636]: I1003 14:45:39.162713 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:45:39 crc kubenswrapper[4636]: I1003 14:45:39.336021 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-z75ww" event={"ID":"74b57add-2980-4266-82f1-145cbc2417f5","Type":"ContainerStarted","Data":"b4ec5967f653bbccc60b454565c26b138c02d1b2f4c49468240597ea55642caf"} Oct 03 14:45:39 crc kubenswrapper[4636]: I1003 14:45:39.352384 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z75ww" podStartSLOduration=2.632538146 podStartE2EDuration="5.352365905s" podCreationTimestamp="2025-10-03 14:45:34 +0000 UTC" firstStartedPulling="2025-10-03 14:45:36.3058002 +0000 UTC m=+2686.164526447" lastFinishedPulling="2025-10-03 14:45:39.025627959 +0000 UTC m=+2688.884354206" observedRunningTime="2025-10-03 14:45:39.35091971 +0000 UTC m=+2689.209645967" watchObservedRunningTime="2025-10-03 14:45:39.352365905 +0000 UTC m=+2689.211092152" Oct 03 14:45:44 crc kubenswrapper[4636]: I1003 14:45:44.703970 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z75ww" Oct 03 14:45:44 crc kubenswrapper[4636]: I1003 14:45:44.704755 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z75ww" Oct 03 14:45:44 crc kubenswrapper[4636]: I1003 14:45:44.751527 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z75ww" Oct 03 14:45:45 crc kubenswrapper[4636]: I1003 14:45:45.448654 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z75ww" Oct 03 14:45:45 crc kubenswrapper[4636]: I1003 14:45:45.505452 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z75ww"] Oct 03 14:45:47 crc kubenswrapper[4636]: I1003 14:45:47.423571 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z75ww" podUID="74b57add-2980-4266-82f1-145cbc2417f5" containerName="registry-server" containerID="cri-o://b4ec5967f653bbccc60b454565c26b138c02d1b2f4c49468240597ea55642caf" gracePeriod=2 Oct 03 14:45:47 crc kubenswrapper[4636]: I1003 14:45:47.462003 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xt665"] Oct 03 14:45:47 crc kubenswrapper[4636]: I1003 14:45:47.465342 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xt665" Oct 03 14:45:47 crc kubenswrapper[4636]: I1003 14:45:47.477597 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xt665"] Oct 03 14:45:47 crc kubenswrapper[4636]: I1003 14:45:47.549427 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0baa8584-ee5f-4b33-a602-deda2eadce76-catalog-content\") pod \"redhat-marketplace-xt665\" (UID: \"0baa8584-ee5f-4b33-a602-deda2eadce76\") " pod="openshift-marketplace/redhat-marketplace-xt665" Oct 03 14:45:47 crc kubenswrapper[4636]: I1003 14:45:47.549499 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx9k5\" (UniqueName: \"kubernetes.io/projected/0baa8584-ee5f-4b33-a602-deda2eadce76-kube-api-access-rx9k5\") pod \"redhat-marketplace-xt665\" (UID: \"0baa8584-ee5f-4b33-a602-deda2eadce76\") " pod="openshift-marketplace/redhat-marketplace-xt665" Oct 03 14:45:47 crc kubenswrapper[4636]: I1003 14:45:47.549611 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0baa8584-ee5f-4b33-a602-deda2eadce76-utilities\") pod \"redhat-marketplace-xt665\" (UID: \"0baa8584-ee5f-4b33-a602-deda2eadce76\") " pod="openshift-marketplace/redhat-marketplace-xt665" Oct 03 14:45:47 crc kubenswrapper[4636]: I1003 14:45:47.651720 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0baa8584-ee5f-4b33-a602-deda2eadce76-catalog-content\") pod \"redhat-marketplace-xt665\" (UID: \"0baa8584-ee5f-4b33-a602-deda2eadce76\") " pod="openshift-marketplace/redhat-marketplace-xt665" Oct 03 14:45:47 crc kubenswrapper[4636]: I1003 14:45:47.651808 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx9k5\" (UniqueName: \"kubernetes.io/projected/0baa8584-ee5f-4b33-a602-deda2eadce76-kube-api-access-rx9k5\") pod \"redhat-marketplace-xt665\" (UID: \"0baa8584-ee5f-4b33-a602-deda2eadce76\") " pod="openshift-marketplace/redhat-marketplace-xt665" Oct 03 14:45:47 crc kubenswrapper[4636]: I1003 14:45:47.651903 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0baa8584-ee5f-4b33-a602-deda2eadce76-utilities\") pod \"redhat-marketplace-xt665\" (UID: \"0baa8584-ee5f-4b33-a602-deda2eadce76\") " pod="openshift-marketplace/redhat-marketplace-xt665" Oct 03 14:45:47 crc kubenswrapper[4636]: I1003 14:45:47.652560 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0baa8584-ee5f-4b33-a602-deda2eadce76-utilities\") pod \"redhat-marketplace-xt665\" (UID: \"0baa8584-ee5f-4b33-a602-deda2eadce76\") " pod="openshift-marketplace/redhat-marketplace-xt665" Oct 03 14:45:47 crc kubenswrapper[4636]: I1003 14:45:47.652837 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0baa8584-ee5f-4b33-a602-deda2eadce76-catalog-content\") pod \"redhat-marketplace-xt665\" (UID: \"0baa8584-ee5f-4b33-a602-deda2eadce76\") " pod="openshift-marketplace/redhat-marketplace-xt665" Oct 03 14:45:47 crc kubenswrapper[4636]: I1003 14:45:47.673656 4636 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rx9k5\" (UniqueName: \"kubernetes.io/projected/0baa8584-ee5f-4b33-a602-deda2eadce76-kube-api-access-rx9k5\") pod \"redhat-marketplace-xt665\" (UID: \"0baa8584-ee5f-4b33-a602-deda2eadce76\") " pod="openshift-marketplace/redhat-marketplace-xt665" Oct 03 14:45:47 crc kubenswrapper[4636]: I1003 14:45:47.883189 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xt665" Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.007613 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z75ww" Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.060146 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74b57add-2980-4266-82f1-145cbc2417f5-utilities\") pod \"74b57add-2980-4266-82f1-145cbc2417f5\" (UID: \"74b57add-2980-4266-82f1-145cbc2417f5\") " Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.060212 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnch5\" (UniqueName: \"kubernetes.io/projected/74b57add-2980-4266-82f1-145cbc2417f5-kube-api-access-tnch5\") pod \"74b57add-2980-4266-82f1-145cbc2417f5\" (UID: \"74b57add-2980-4266-82f1-145cbc2417f5\") " Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.060349 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74b57add-2980-4266-82f1-145cbc2417f5-catalog-content\") pod \"74b57add-2980-4266-82f1-145cbc2417f5\" (UID: \"74b57add-2980-4266-82f1-145cbc2417f5\") " Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.061728 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74b57add-2980-4266-82f1-145cbc2417f5-utilities" (OuterVolumeSpecName: "utilities") pod "74b57add-2980-4266-82f1-145cbc2417f5" (UID: "74b57add-2980-4266-82f1-145cbc2417f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.082262 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74b57add-2980-4266-82f1-145cbc2417f5-kube-api-access-tnch5" (OuterVolumeSpecName: "kube-api-access-tnch5") pod "74b57add-2980-4266-82f1-145cbc2417f5" (UID: "74b57add-2980-4266-82f1-145cbc2417f5"). InnerVolumeSpecName "kube-api-access-tnch5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.163618 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74b57add-2980-4266-82f1-145cbc2417f5-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.163651 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnch5\" (UniqueName: \"kubernetes.io/projected/74b57add-2980-4266-82f1-145cbc2417f5-kube-api-access-tnch5\") on node \"crc\" DevicePath \"\"" Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.300245 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74b57add-2980-4266-82f1-145cbc2417f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74b57add-2980-4266-82f1-145cbc2417f5" (UID: "74b57add-2980-4266-82f1-145cbc2417f5"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.373039 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74b57add-2980-4266-82f1-145cbc2417f5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.375361 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xt665"] Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.445728 4636 generic.go:334] "Generic (PLEG): container finished" podID="74b57add-2980-4266-82f1-145cbc2417f5" containerID="b4ec5967f653bbccc60b454565c26b138c02d1b2f4c49468240597ea55642caf" exitCode=0 Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.445816 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z75ww" event={"ID":"74b57add-2980-4266-82f1-145cbc2417f5","Type":"ContainerDied","Data":"b4ec5967f653bbccc60b454565c26b138c02d1b2f4c49468240597ea55642caf"} Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.445876 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z75ww" event={"ID":"74b57add-2980-4266-82f1-145cbc2417f5","Type":"ContainerDied","Data":"d3951ab110d0a8e157a3b9a8c0b85a71e224ca2ef9564f3e1bed281e285a22de"} Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.445899 4636 scope.go:117] "RemoveContainer" containerID="b4ec5967f653bbccc60b454565c26b138c02d1b2f4c49468240597ea55642caf" Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.446124 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z75ww" Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.450728 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xt665" event={"ID":"0baa8584-ee5f-4b33-a602-deda2eadce76","Type":"ContainerStarted","Data":"4085c0d386af72baf439f12d55ff5cf2acfa817a7479187075deaa70cbeaa638"} Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.497289 4636 scope.go:117] "RemoveContainer" containerID="ada7a76458952e4b75606da5cd5883d7b7da9f62c15af8c80591f0fbdf48eaec" Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.499713 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z75ww"] Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.507888 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z75ww"] Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.520748 4636 scope.go:117] "RemoveContainer" containerID="0ebbade4f4c62c2e5c0902c90f8966e6ef9594d8ff960e8e23e6b09f7df8161b" Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.543504 4636 scope.go:117] "RemoveContainer" containerID="b4ec5967f653bbccc60b454565c26b138c02d1b2f4c49468240597ea55642caf" Oct 03 14:45:48 crc kubenswrapper[4636]: E1003 14:45:48.543969 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4ec5967f653bbccc60b454565c26b138c02d1b2f4c49468240597ea55642caf\": container with ID starting with b4ec5967f653bbccc60b454565c26b138c02d1b2f4c49468240597ea55642caf not found: ID does not exist" containerID="b4ec5967f653bbccc60b454565c26b138c02d1b2f4c49468240597ea55642caf" Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.544024 4636 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4ec5967f653bbccc60b454565c26b138c02d1b2f4c49468240597ea55642caf"} err="failed to get container status \"b4ec5967f653bbccc60b454565c26b138c02d1b2f4c49468240597ea55642caf\": rpc error: code = NotFound desc = could not find container \"b4ec5967f653bbccc60b454565c26b138c02d1b2f4c49468240597ea55642caf\": container with ID starting with b4ec5967f653bbccc60b454565c26b138c02d1b2f4c49468240597ea55642caf not found: ID does not exist" Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.544054 4636 scope.go:117] "RemoveContainer" containerID="ada7a76458952e4b75606da5cd5883d7b7da9f62c15af8c80591f0fbdf48eaec" Oct 03 14:45:48 crc kubenswrapper[4636]: E1003 14:45:48.545669 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ada7a76458952e4b75606da5cd5883d7b7da9f62c15af8c80591f0fbdf48eaec\": container with ID starting with ada7a76458952e4b75606da5cd5883d7b7da9f62c15af8c80591f0fbdf48eaec not found: ID does not exist" containerID="ada7a76458952e4b75606da5cd5883d7b7da9f62c15af8c80591f0fbdf48eaec" Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.545804 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ada7a76458952e4b75606da5cd5883d7b7da9f62c15af8c80591f0fbdf48eaec"} err="failed to get container status \"ada7a76458952e4b75606da5cd5883d7b7da9f62c15af8c80591f0fbdf48eaec\": rpc error: code = NotFound desc = could not find container \"ada7a76458952e4b75606da5cd5883d7b7da9f62c15af8c80591f0fbdf48eaec\": container with ID starting with ada7a76458952e4b75606da5cd5883d7b7da9f62c15af8c80591f0fbdf48eaec not found: ID does not exist" Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.545929 4636 scope.go:117] "RemoveContainer" containerID="0ebbade4f4c62c2e5c0902c90f8966e6ef9594d8ff960e8e23e6b09f7df8161b" Oct 03 14:45:48 crc kubenswrapper[4636]: E1003 14:45:48.546502 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ebbade4f4c62c2e5c0902c90f8966e6ef9594d8ff960e8e23e6b09f7df8161b\": container with ID starting with 0ebbade4f4c62c2e5c0902c90f8966e6ef9594d8ff960e8e23e6b09f7df8161b not found: ID does not exist" containerID="0ebbade4f4c62c2e5c0902c90f8966e6ef9594d8ff960e8e23e6b09f7df8161b" Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.546616 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ebbade4f4c62c2e5c0902c90f8966e6ef9594d8ff960e8e23e6b09f7df8161b"} err="failed to get container status \"0ebbade4f4c62c2e5c0902c90f8966e6ef9594d8ff960e8e23e6b09f7df8161b\": rpc error: code = NotFound desc = could not find container \"0ebbade4f4c62c2e5c0902c90f8966e6ef9594d8ff960e8e23e6b09f7df8161b\": container with ID starting with 0ebbade4f4c62c2e5c0902c90f8966e6ef9594d8ff960e8e23e6b09f7df8161b not found: ID does not exist" Oct 03 14:45:48 crc kubenswrapper[4636]: I1003 14:45:48.806253 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74b57add-2980-4266-82f1-145cbc2417f5" path="/var/lib/kubelet/pods/74b57add-2980-4266-82f1-145cbc2417f5/volumes" Oct 03 14:45:49 crc kubenswrapper[4636]: I1003 14:45:49.463510 4636 generic.go:334] "Generic (PLEG): container finished" podID="0baa8584-ee5f-4b33-a602-deda2eadce76" containerID="84bd81f800abb54a4a62ae1d204ec10033478b63b4fffc01938c3fd67dc97c73" exitCode=0 Oct 03 14:45:49 crc kubenswrapper[4636]: I1003 
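The error/info pairs above show the kubelet re-issuing RemoveContainer for IDs CRI-O has already deleted; the NotFound status is logged but treated as benign, so the cleanup stays idempotent. A sketch of that tolerate-NotFound pattern, with a stand-in runtime call and a sentinel error in place of the real gRPC NotFound status:

```go
package main

import (
	"errors"
	"fmt"
)

// errNotFound stands in for the runtime's NotFound gRPC status.
var errNotFound = errors.New("container not found")

// removeContainer is a stand-in for a CRI RemoveContainer call.
func removeContainer(id string, existing map[string]bool) error {
	if !existing[id] {
		return fmt.Errorf("could not find container %q: %w", id, errNotFound)
	}
	delete(existing, id)
	return nil
}

// cleanup treats "already gone" as success so retries stay idempotent,
// which is why the log pairs an error line with a non-fatal info line.
func cleanup(ids []string, existing map[string]bool) {
	for _, id := range ids {
		if err := removeContainer(id, existing); err != nil {
			if errors.Is(err, errNotFound) {
				fmt.Printf("RemoveContainer %s: already removed (%v)\n", id, err)
				continue // benign: the desired state already holds
			}
			fmt.Printf("RemoveContainer %s failed: %v\n", id, err)
		}
	}
}

func main() {
	existing := map[string]bool{"b4ec5967": true}
	cleanup([]string{"b4ec5967", "b4ec5967", "ada7a764"}, existing)
}
```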
Oct 03 14:45:49 crc kubenswrapper[4636]: I1003 14:45:49.463510 4636 generic.go:334] "Generic (PLEG): container finished" podID="0baa8584-ee5f-4b33-a602-deda2eadce76" containerID="84bd81f800abb54a4a62ae1d204ec10033478b63b4fffc01938c3fd67dc97c73" exitCode=0
Oct 03 14:45:49 crc kubenswrapper[4636]: I1003 14:45:49.463669 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xt665" event={"ID":"0baa8584-ee5f-4b33-a602-deda2eadce76","Type":"ContainerDied","Data":"84bd81f800abb54a4a62ae1d204ec10033478b63b4fffc01938c3fd67dc97c73"}
Oct 03 14:45:50 crc kubenswrapper[4636]: I1003 14:45:50.473211 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xt665" event={"ID":"0baa8584-ee5f-4b33-a602-deda2eadce76","Type":"ContainerStarted","Data":"88d910167d0ba01f14c71cf56e4b414d884d25d141e22ad6933a974562c0f623"}
Oct 03 14:45:51 crc kubenswrapper[4636]: I1003 14:45:51.484434 4636 generic.go:334] "Generic (PLEG): container finished" podID="0baa8584-ee5f-4b33-a602-deda2eadce76" containerID="88d910167d0ba01f14c71cf56e4b414d884d25d141e22ad6933a974562c0f623" exitCode=0
Oct 03 14:45:51 crc kubenswrapper[4636]: I1003 14:45:51.484527 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xt665" event={"ID":"0baa8584-ee5f-4b33-a602-deda2eadce76","Type":"ContainerDied","Data":"88d910167d0ba01f14c71cf56e4b414d884d25d141e22ad6933a974562c0f623"}
Oct 03 14:45:52 crc kubenswrapper[4636]: I1003 14:45:52.495001 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xt665" event={"ID":"0baa8584-ee5f-4b33-a602-deda2eadce76","Type":"ContainerStarted","Data":"34b4ede968d2fedb3c8efaa442ecfa2261827c5a41e7cea91d2da80bfe700c92"}
Oct 03 14:45:52 crc kubenswrapper[4636]: I1003 14:45:52.515986 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xt665" podStartSLOduration=3.025790628 podStartE2EDuration="5.515966982s" podCreationTimestamp="2025-10-03 14:45:47 +0000 UTC" firstStartedPulling="2025-10-03 14:45:49.46565196 +0000 UTC m=+2699.324378207" lastFinishedPulling="2025-10-03 14:45:51.955828304 +0000 UTC m=+2701.814554561" observedRunningTime="2025-10-03 14:45:52.511123491 +0000 UTC m=+2702.369849758" watchObservedRunningTime="2025-10-03 14:45:52.515966982 +0000 UTC m=+2702.374693229"
Oct 03 14:45:57 crc kubenswrapper[4636]: I1003 14:45:57.883586 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xt665"
Oct 03 14:45:57 crc kubenswrapper[4636]: I1003 14:45:57.884190 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xt665"
Oct 03 14:45:57 crc kubenswrapper[4636]: I1003 14:45:57.928584 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xt665"
Oct 03 14:45:58 crc kubenswrapper[4636]: I1003 14:45:58.589022 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xt665"
Oct 03 14:45:58 crc kubenswrapper[4636]: I1003 14:45:58.640970 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xt665"]
Oct 03 14:46:00 crc kubenswrapper[4636]: I1003 14:46:00.556640 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xt665" podUID="0baa8584-ee5f-4b33-a602-deda2eadce76" containerName="registry-server" containerID="cri-o://34b4ede968d2fedb3c8efaa442ecfa2261827c5a41e7cea91d2da80bfe700c92" gracePeriod=2
Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.527388 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xt665"
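"Killing container with a grace period" with gracePeriod=2 is the standard two-phase stop: deliver SIGTERM, wait up to the grace period, then escalate to SIGKILL. A Unix-only sketch of that escalation; the child process here stands in for the container, and none of this is the runtime's actual code:

```go
package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

// killWithGrace sends SIGTERM, waits up to grace for the process to
// exit, then escalates to SIGKILL, mirroring how a gracePeriod is
// applied to a container being stopped.
func killWithGrace(cmd *exec.Cmd, grace time.Duration) {
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()

	_ = cmd.Process.Signal(syscall.SIGTERM)
	select {
	case <-done:
		fmt.Println("exited within grace period")
	case <-time.After(grace):
		_ = cmd.Process.Kill() // SIGKILL once the grace period expires
		<-done
		fmt.Println("killed after grace period expired")
	}
}

func main() {
	cmd := exec.Command("sleep", "60") // SIGTERM ends it promptly
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	killWithGrace(cmd, 2*time.Second) // gracePeriod=2, as for registry-server
}
```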
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xt665" Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.573267 4636 generic.go:334] "Generic (PLEG): container finished" podID="0baa8584-ee5f-4b33-a602-deda2eadce76" containerID="34b4ede968d2fedb3c8efaa442ecfa2261827c5a41e7cea91d2da80bfe700c92" exitCode=0 Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.573332 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xt665" Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.573344 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xt665" event={"ID":"0baa8584-ee5f-4b33-a602-deda2eadce76","Type":"ContainerDied","Data":"34b4ede968d2fedb3c8efaa442ecfa2261827c5a41e7cea91d2da80bfe700c92"} Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.574282 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xt665" event={"ID":"0baa8584-ee5f-4b33-a602-deda2eadce76","Type":"ContainerDied","Data":"4085c0d386af72baf439f12d55ff5cf2acfa817a7479187075deaa70cbeaa638"} Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.574305 4636 scope.go:117] "RemoveContainer" containerID="34b4ede968d2fedb3c8efaa442ecfa2261827c5a41e7cea91d2da80bfe700c92" Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.597901 4636 scope.go:117] "RemoveContainer" containerID="88d910167d0ba01f14c71cf56e4b414d884d25d141e22ad6933a974562c0f623" Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.618590 4636 scope.go:117] "RemoveContainer" containerID="84bd81f800abb54a4a62ae1d204ec10033478b63b4fffc01938c3fd67dc97c73" Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.699061 4636 scope.go:117] "RemoveContainer" containerID="34b4ede968d2fedb3c8efaa442ecfa2261827c5a41e7cea91d2da80bfe700c92" Oct 03 14:46:01 crc kubenswrapper[4636]: E1003 14:46:01.699500 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34b4ede968d2fedb3c8efaa442ecfa2261827c5a41e7cea91d2da80bfe700c92\": container with ID starting with 34b4ede968d2fedb3c8efaa442ecfa2261827c5a41e7cea91d2da80bfe700c92 not found: ID does not exist" containerID="34b4ede968d2fedb3c8efaa442ecfa2261827c5a41e7cea91d2da80bfe700c92" Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.699530 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34b4ede968d2fedb3c8efaa442ecfa2261827c5a41e7cea91d2da80bfe700c92"} err="failed to get container status \"34b4ede968d2fedb3c8efaa442ecfa2261827c5a41e7cea91d2da80bfe700c92\": rpc error: code = NotFound desc = could not find container \"34b4ede968d2fedb3c8efaa442ecfa2261827c5a41e7cea91d2da80bfe700c92\": container with ID starting with 34b4ede968d2fedb3c8efaa442ecfa2261827c5a41e7cea91d2da80bfe700c92 not found: ID does not exist" Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.699553 4636 scope.go:117] "RemoveContainer" containerID="88d910167d0ba01f14c71cf56e4b414d884d25d141e22ad6933a974562c0f623" Oct 03 14:46:01 crc kubenswrapper[4636]: E1003 14:46:01.699905 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88d910167d0ba01f14c71cf56e4b414d884d25d141e22ad6933a974562c0f623\": container with ID starting with 88d910167d0ba01f14c71cf56e4b414d884d25d141e22ad6933a974562c0f623 not found: ID does not exist" 
containerID="88d910167d0ba01f14c71cf56e4b414d884d25d141e22ad6933a974562c0f623" Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.699945 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88d910167d0ba01f14c71cf56e4b414d884d25d141e22ad6933a974562c0f623"} err="failed to get container status \"88d910167d0ba01f14c71cf56e4b414d884d25d141e22ad6933a974562c0f623\": rpc error: code = NotFound desc = could not find container \"88d910167d0ba01f14c71cf56e4b414d884d25d141e22ad6933a974562c0f623\": container with ID starting with 88d910167d0ba01f14c71cf56e4b414d884d25d141e22ad6933a974562c0f623 not found: ID does not exist" Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.699975 4636 scope.go:117] "RemoveContainer" containerID="84bd81f800abb54a4a62ae1d204ec10033478b63b4fffc01938c3fd67dc97c73" Oct 03 14:46:01 crc kubenswrapper[4636]: E1003 14:46:01.700413 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84bd81f800abb54a4a62ae1d204ec10033478b63b4fffc01938c3fd67dc97c73\": container with ID starting with 84bd81f800abb54a4a62ae1d204ec10033478b63b4fffc01938c3fd67dc97c73 not found: ID does not exist" containerID="84bd81f800abb54a4a62ae1d204ec10033478b63b4fffc01938c3fd67dc97c73" Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.700454 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84bd81f800abb54a4a62ae1d204ec10033478b63b4fffc01938c3fd67dc97c73"} err="failed to get container status \"84bd81f800abb54a4a62ae1d204ec10033478b63b4fffc01938c3fd67dc97c73\": rpc error: code = NotFound desc = could not find container \"84bd81f800abb54a4a62ae1d204ec10033478b63b4fffc01938c3fd67dc97c73\": container with ID starting with 84bd81f800abb54a4a62ae1d204ec10033478b63b4fffc01938c3fd67dc97c73 not found: ID does not exist" Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.719735 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx9k5\" (UniqueName: \"kubernetes.io/projected/0baa8584-ee5f-4b33-a602-deda2eadce76-kube-api-access-rx9k5\") pod \"0baa8584-ee5f-4b33-a602-deda2eadce76\" (UID: \"0baa8584-ee5f-4b33-a602-deda2eadce76\") " Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.719795 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0baa8584-ee5f-4b33-a602-deda2eadce76-catalog-content\") pod \"0baa8584-ee5f-4b33-a602-deda2eadce76\" (UID: \"0baa8584-ee5f-4b33-a602-deda2eadce76\") " Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.719856 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0baa8584-ee5f-4b33-a602-deda2eadce76-utilities\") pod \"0baa8584-ee5f-4b33-a602-deda2eadce76\" (UID: \"0baa8584-ee5f-4b33-a602-deda2eadce76\") " Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.721212 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0baa8584-ee5f-4b33-a602-deda2eadce76-utilities" (OuterVolumeSpecName: "utilities") pod "0baa8584-ee5f-4b33-a602-deda2eadce76" (UID: "0baa8584-ee5f-4b33-a602-deda2eadce76"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.726818 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0baa8584-ee5f-4b33-a602-deda2eadce76-kube-api-access-rx9k5" (OuterVolumeSpecName: "kube-api-access-rx9k5") pod "0baa8584-ee5f-4b33-a602-deda2eadce76" (UID: "0baa8584-ee5f-4b33-a602-deda2eadce76"). InnerVolumeSpecName "kube-api-access-rx9k5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.735803 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0baa8584-ee5f-4b33-a602-deda2eadce76-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0baa8584-ee5f-4b33-a602-deda2eadce76" (UID: "0baa8584-ee5f-4b33-a602-deda2eadce76"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.822849 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx9k5\" (UniqueName: \"kubernetes.io/projected/0baa8584-ee5f-4b33-a602-deda2eadce76-kube-api-access-rx9k5\") on node \"crc\" DevicePath \"\"" Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.822898 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0baa8584-ee5f-4b33-a602-deda2eadce76-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.822908 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0baa8584-ee5f-4b33-a602-deda2eadce76-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.908752 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xt665"] Oct 03 14:46:01 crc kubenswrapper[4636]: I1003 14:46:01.917134 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xt665"] Oct 03 14:46:02 crc kubenswrapper[4636]: I1003 14:46:02.803826 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0baa8584-ee5f-4b33-a602-deda2eadce76" path="/var/lib/kubelet/pods/0baa8584-ee5f-4b33-a602-deda2eadce76/volumes" Oct 03 14:46:09 crc kubenswrapper[4636]: I1003 14:46:09.162666 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:46:09 crc kubenswrapper[4636]: I1003 14:46:09.163298 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:46:39 crc kubenswrapper[4636]: I1003 14:46:39.163331 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:46:39 crc kubenswrapper[4636]: I1003 14:46:39.163743 4636 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:46:39 crc kubenswrapper[4636]: I1003 14:46:39.163784 4636 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 14:46:39 crc kubenswrapper[4636]: I1003 14:46:39.164256 4636 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9b6189a9993a988a8228eca73a7aab98395784c64925bc5bfe9f78c4defdcb4d"} pod="openshift-machine-config-operator/machine-config-daemon-ngmch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 14:46:39 crc kubenswrapper[4636]: I1003 14:46:39.164313 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" containerID="cri-o://9b6189a9993a988a8228eca73a7aab98395784c64925bc5bfe9f78c4defdcb4d" gracePeriod=600 Oct 03 14:46:39 crc kubenswrapper[4636]: I1003 14:46:39.908957 4636 generic.go:334] "Generic (PLEG): container finished" podID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerID="9b6189a9993a988a8228eca73a7aab98395784c64925bc5bfe9f78c4defdcb4d" exitCode=0 Oct 03 14:46:39 crc kubenswrapper[4636]: I1003 14:46:39.909035 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerDied","Data":"9b6189a9993a988a8228eca73a7aab98395784c64925bc5bfe9f78c4defdcb4d"} Oct 03 14:46:39 crc kubenswrapper[4636]: I1003 14:46:39.909378 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerStarted","Data":"b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"} Oct 03 14:46:39 crc kubenswrapper[4636]: I1003 14:46:39.909454 4636 scope.go:117] "RemoveContainer" containerID="021bfe60326ec3ff3080b458f87625e5560cd0d5dff11c589b4a4aa178b283ab" Oct 03 14:48:39 crc kubenswrapper[4636]: I1003 14:48:39.162899 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 14:48:39 crc kubenswrapper[4636]: I1003 14:48:39.163462 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 14:48:51 crc kubenswrapper[4636]: I1003 14:48:51.017572 4636 generic.go:334] "Generic (PLEG): container finished" podID="ee4e092c-de87-4547-a39a-1a451ef9dc64" containerID="9f53e838ef5bae27a353590a84aec336acabe3cb06f777fd6edc6749dea22d51" exitCode=0 Oct 03 14:48:51 crc kubenswrapper[4636]: I1003 14:48:51.017648 4636 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" event={"ID":"ee4e092c-de87-4547-a39a-1a451ef9dc64","Type":"ContainerDied","Data":"9f53e838ef5bae27a353590a84aec336acabe3cb06f777fd6edc6749dea22d51"} Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.467069 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.568200 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-migration-ssh-key-1\") pod \"ee4e092c-de87-4547-a39a-1a451ef9dc64\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.568249 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-ssh-key\") pod \"ee4e092c-de87-4547-a39a-1a451ef9dc64\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.568285 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-inventory\") pod \"ee4e092c-de87-4547-a39a-1a451ef9dc64\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.568327 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-migration-ssh-key-0\") pod \"ee4e092c-de87-4547-a39a-1a451ef9dc64\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.568433 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-cell1-compute-config-0\") pod \"ee4e092c-de87-4547-a39a-1a451ef9dc64\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.568452 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-combined-ca-bundle\") pod \"ee4e092c-de87-4547-a39a-1a451ef9dc64\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.568513 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddxvb\" (UniqueName: \"kubernetes.io/projected/ee4e092c-de87-4547-a39a-1a451ef9dc64-kube-api-access-ddxvb\") pod \"ee4e092c-de87-4547-a39a-1a451ef9dc64\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.568549 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-extra-config-0\") pod \"ee4e092c-de87-4547-a39a-1a451ef9dc64\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.568810 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-cell1-compute-config-1\") pod \"ee4e092c-de87-4547-a39a-1a451ef9dc64\" (UID: \"ee4e092c-de87-4547-a39a-1a451ef9dc64\") " Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.604564 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "ee4e092c-de87-4547-a39a-1a451ef9dc64" (UID: "ee4e092c-de87-4547-a39a-1a451ef9dc64"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.606432 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee4e092c-de87-4547-a39a-1a451ef9dc64-kube-api-access-ddxvb" (OuterVolumeSpecName: "kube-api-access-ddxvb") pod "ee4e092c-de87-4547-a39a-1a451ef9dc64" (UID: "ee4e092c-de87-4547-a39a-1a451ef9dc64"). InnerVolumeSpecName "kube-api-access-ddxvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.617644 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "ee4e092c-de87-4547-a39a-1a451ef9dc64" (UID: "ee4e092c-de87-4547-a39a-1a451ef9dc64"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.625728 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "ee4e092c-de87-4547-a39a-1a451ef9dc64" (UID: "ee4e092c-de87-4547-a39a-1a451ef9dc64"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.645255 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ee4e092c-de87-4547-a39a-1a451ef9dc64" (UID: "ee4e092c-de87-4547-a39a-1a451ef9dc64"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.664426 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "ee4e092c-de87-4547-a39a-1a451ef9dc64" (UID: "ee4e092c-de87-4547-a39a-1a451ef9dc64"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.665434 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "ee4e092c-de87-4547-a39a-1a451ef9dc64" (UID: "ee4e092c-de87-4547-a39a-1a451ef9dc64"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.666402 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-inventory" (OuterVolumeSpecName: "inventory") pod "ee4e092c-de87-4547-a39a-1a451ef9dc64" (UID: "ee4e092c-de87-4547-a39a-1a451ef9dc64"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.671129 4636 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.671155 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddxvb\" (UniqueName: \"kubernetes.io/projected/ee4e092c-de87-4547-a39a-1a451ef9dc64-kube-api-access-ddxvb\") on node \"crc\" DevicePath \"\"" Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.671166 4636 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.671177 4636 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.671188 4636 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.671199 4636 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.671209 4636 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.671219 4636 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.673611 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "ee4e092c-de87-4547-a39a-1a451ef9dc64" (UID: "ee4e092c-de87-4547-a39a-1a451ef9dc64"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 14:48:52 crc kubenswrapper[4636]: I1003 14:48:52.773277 4636 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ee4e092c-de87-4547-a39a-1a451ef9dc64-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.038239 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" event={"ID":"ee4e092c-de87-4547-a39a-1a451ef9dc64","Type":"ContainerDied","Data":"83247dfb0da13858ace760ef8827b81b9ec86fd3873f833844a58d4ca62d51c3"} Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.038279 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ggwhp" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.038284 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83247dfb0da13858ace760ef8827b81b9ec86fd3873f833844a58d4ca62d51c3" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.233541 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj"] Oct 03 14:48:53 crc kubenswrapper[4636]: E1003 14:48:53.234310 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74b57add-2980-4266-82f1-145cbc2417f5" containerName="extract-content" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.234328 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="74b57add-2980-4266-82f1-145cbc2417f5" containerName="extract-content" Oct 03 14:48:53 crc kubenswrapper[4636]: E1003 14:48:53.234347 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0baa8584-ee5f-4b33-a602-deda2eadce76" containerName="extract-content" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.234353 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="0baa8584-ee5f-4b33-a602-deda2eadce76" containerName="extract-content" Oct 03 14:48:53 crc kubenswrapper[4636]: E1003 14:48:53.234361 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0baa8584-ee5f-4b33-a602-deda2eadce76" containerName="extract-utilities" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.234368 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="0baa8584-ee5f-4b33-a602-deda2eadce76" containerName="extract-utilities" Oct 03 14:48:53 crc kubenswrapper[4636]: E1003 14:48:53.234379 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee4e092c-de87-4547-a39a-1a451ef9dc64" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.234385 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee4e092c-de87-4547-a39a-1a451ef9dc64" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 03 14:48:53 crc kubenswrapper[4636]: E1003 14:48:53.234392 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74b57add-2980-4266-82f1-145cbc2417f5" containerName="extract-utilities" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.234398 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="74b57add-2980-4266-82f1-145cbc2417f5" containerName="extract-utilities" Oct 03 14:48:53 crc kubenswrapper[4636]: E1003 14:48:53.234417 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0baa8584-ee5f-4b33-a602-deda2eadce76" containerName="registry-server" Oct 03 14:48:53 crc 
kubenswrapper[4636]: I1003 14:48:53.234423 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="0baa8584-ee5f-4b33-a602-deda2eadce76" containerName="registry-server" Oct 03 14:48:53 crc kubenswrapper[4636]: E1003 14:48:53.234447 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74b57add-2980-4266-82f1-145cbc2417f5" containerName="registry-server" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.234453 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="74b57add-2980-4266-82f1-145cbc2417f5" containerName="registry-server" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.234614 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee4e092c-de87-4547-a39a-1a451ef9dc64" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.234633 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="0baa8584-ee5f-4b33-a602-deda2eadce76" containerName="registry-server" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.234645 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="74b57add-2980-4266-82f1-145cbc2417f5" containerName="registry-server" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.235233 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.237450 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.237510 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8b7c8" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.238181 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.238562 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.238779 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.252697 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj"] Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.384492 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4njmj\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.384652 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4njmj\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.384710 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
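The cpu_manager/memory_manager burst above happens at pod admission: before computing assignments for the new telemetry pod, both managers drop per-container state left behind by pods that no longer exist (the two marketplace pods and the finished nova-edpm job); the E-level lines record the same cleanup as the I-level ones. A sketch of that stale-state sweep with illustrative types:

```go
package main

import "fmt"

type key struct{ podUID, container string }

// removeStaleState deletes resource-manager assignments for any pod
// that is no longer active, as cpu_manager.go and memory_manager.go
// report when the next pod is admitted.
func removeStaleState(assignments map[key]string, active map[string]bool) {
	for k := range assignments {
		if active[k.podUID] {
			continue
		}
		fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n",
			k.podUID, k.container)
		delete(assignments, k) // deleting while ranging a map is safe in Go
	}
}

func main() {
	assignments := map[key]string{
		{"74b57add", "registry-server"}: "cpuset:0-1",
		{"0baa8584", "extract-content"}: "cpuset:2",
		{"88e3290e", "telemetry"}:       "cpuset:3",
	}
	removeStaleState(assignments, map[string]bool{"88e3290e": true})
	fmt.Println("remaining assignments:", len(assignments))
}
```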
started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4njmj\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.384812 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4njmj\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.384851 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4njmj\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.384923 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzc2n\" (UniqueName: \"kubernetes.io/projected/88e3290e-0c0d-4304-bcd2-b500068dc443-kube-api-access-jzc2n\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4njmj\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.384988 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4njmj\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.487039 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4njmj\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.487158 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4njmj\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.487198 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4njmj\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.487228 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4njmj\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.487243 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4njmj\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.487269 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzc2n\" (UniqueName: \"kubernetes.io/projected/88e3290e-0c0d-4304-bcd2-b500068dc443-kube-api-access-jzc2n\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4njmj\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.487302 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4njmj\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.491327 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4njmj\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.491536 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4njmj\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.492136 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4njmj\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.492833 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ceilometer-compute-config-data-2\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-4njmj\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.502816 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4njmj\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.502943 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4njmj\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.513176 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzc2n\" (UniqueName: \"kubernetes.io/projected/88e3290e-0c0d-4304-bcd2-b500068dc443-kube-api-access-jzc2n\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4njmj\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:53 crc kubenswrapper[4636]: I1003 14:48:53.554222 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" Oct 03 14:48:54 crc kubenswrapper[4636]: I1003 14:48:54.132195 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj"] Oct 03 14:48:55 crc kubenswrapper[4636]: I1003 14:48:55.056615 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" event={"ID":"88e3290e-0c0d-4304-bcd2-b500068dc443","Type":"ContainerStarted","Data":"86c415f79a5609ed284b044e3d6de0b99f7c2d463a15c508586d2b720ae0d7ab"} Oct 03 14:48:55 crc kubenswrapper[4636]: I1003 14:48:55.056954 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" event={"ID":"88e3290e-0c0d-4304-bcd2-b500068dc443","Type":"ContainerStarted","Data":"de628da2f3424a1066ce35ea101557cbf7db6ee80e20339fc8c7437474a42538"} Oct 03 14:48:55 crc kubenswrapper[4636]: I1003 14:48:55.080855 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" podStartSLOduration=1.609296179 podStartE2EDuration="2.080837484s" podCreationTimestamp="2025-10-03 14:48:53 +0000 UTC" firstStartedPulling="2025-10-03 14:48:54.141795879 +0000 UTC m=+2884.000522126" lastFinishedPulling="2025-10-03 14:48:54.613337184 +0000 UTC m=+2884.472063431" observedRunningTime="2025-10-03 14:48:55.072528737 +0000 UTC m=+2884.931255014" watchObservedRunningTime="2025-10-03 14:48:55.080837484 +0000 UTC m=+2884.939563731" Oct 03 14:49:07 crc kubenswrapper[4636]: I1003 14:49:07.717052 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bdqk2"] Oct 03 14:49:07 crc kubenswrapper[4636]: I1003 14:49:07.720709 4636 util.go:30] "No sandbox for pod can be found. 
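The pod_startup_latency_tracker entry above encodes a simple relationship: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure less the image-pull window (lastFinishedPulling minus firstStartedPulling). Both tracker entries in this log satisfy it exactly (2.080837484 - 0.471541305 = 1.609296179 here). A minimal Python check against the timestamps logged above, assuming only that relationship:

```python
from datetime import datetime

def parse(ts: str) -> datetime:
    """Parse klog timestamps like '2025-10-03 14:48:54.141795879 +0000 UTC',
    truncating nanoseconds to microseconds (datetime's resolution)."""
    date, clock, off = ts.split()[:3]          # drop the trailing 'UTC'
    if "." in clock:
        clock = clock[: clock.index(".") + 7]  # keep at most 6 fractional digits
        fmt = "%Y-%m-%d %H:%M:%S.%f %z"
    else:
        fmt = "%Y-%m-%d %H:%M:%S %z"
    return datetime.strptime(f"{date} {clock} {off}", fmt)

created    = parse("2025-10-03 14:48:53 +0000 UTC")            # podCreationTimestamp
first_pull = parse("2025-10-03 14:48:54.141795879 +0000 UTC")  # firstStartedPulling
last_pull  = parse("2025-10-03 14:48:54.613337184 +0000 UTC")  # lastFinishedPulling
running    = parse("2025-10-03 14:48:55.080837484 +0000 UTC")  # watchObservedRunningTime

e2e = (running - created).total_seconds()             # ~2.080837 (podStartE2EDuration)
slo = e2e - (last_pull - first_pull).total_seconds()  # ~1.609296 (podStartSLOduration)
print(f"E2E={e2e:.6f}s SLO={slo:.6f}s")
```

The same arithmetic reproduces the community-operators tracker entry further down (5.261292297s end-to-end, 2.822268949s of pulling, 2.439023348 SLO), up to the microsecond truncation noted in the comment.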
Oct 03 14:49:07 crc kubenswrapper[4636]: I1003 14:49:07.717052 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bdqk2"]
Oct 03 14:49:07 crc kubenswrapper[4636]: I1003 14:49:07.720709 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bdqk2"
Oct 03 14:49:07 crc kubenswrapper[4636]: I1003 14:49:07.741730 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bdqk2"]
Oct 03 14:49:07 crc kubenswrapper[4636]: I1003 14:49:07.767786 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdszr\" (UniqueName: \"kubernetes.io/projected/51ecaf38-b4a1-477f-9620-6104f3b5787c-kube-api-access-xdszr\") pod \"community-operators-bdqk2\" (UID: \"51ecaf38-b4a1-477f-9620-6104f3b5787c\") " pod="openshift-marketplace/community-operators-bdqk2"
Oct 03 14:49:07 crc kubenswrapper[4636]: I1003 14:49:07.767974 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51ecaf38-b4a1-477f-9620-6104f3b5787c-catalog-content\") pod \"community-operators-bdqk2\" (UID: \"51ecaf38-b4a1-477f-9620-6104f3b5787c\") " pod="openshift-marketplace/community-operators-bdqk2"
Oct 03 14:49:07 crc kubenswrapper[4636]: I1003 14:49:07.768005 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51ecaf38-b4a1-477f-9620-6104f3b5787c-utilities\") pod \"community-operators-bdqk2\" (UID: \"51ecaf38-b4a1-477f-9620-6104f3b5787c\") " pod="openshift-marketplace/community-operators-bdqk2"
Oct 03 14:49:07 crc kubenswrapper[4636]: I1003 14:49:07.869067 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51ecaf38-b4a1-477f-9620-6104f3b5787c-catalog-content\") pod \"community-operators-bdqk2\" (UID: \"51ecaf38-b4a1-477f-9620-6104f3b5787c\") " pod="openshift-marketplace/community-operators-bdqk2"
Oct 03 14:49:07 crc kubenswrapper[4636]: I1003 14:49:07.869321 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51ecaf38-b4a1-477f-9620-6104f3b5787c-utilities\") pod \"community-operators-bdqk2\" (UID: \"51ecaf38-b4a1-477f-9620-6104f3b5787c\") " pod="openshift-marketplace/community-operators-bdqk2"
Oct 03 14:49:07 crc kubenswrapper[4636]: I1003 14:49:07.869644 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51ecaf38-b4a1-477f-9620-6104f3b5787c-catalog-content\") pod \"community-operators-bdqk2\" (UID: \"51ecaf38-b4a1-477f-9620-6104f3b5787c\") " pod="openshift-marketplace/community-operators-bdqk2"
Oct 03 14:49:07 crc kubenswrapper[4636]: I1003 14:49:07.869782 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51ecaf38-b4a1-477f-9620-6104f3b5787c-utilities\") pod \"community-operators-bdqk2\" (UID: \"51ecaf38-b4a1-477f-9620-6104f3b5787c\") " pod="openshift-marketplace/community-operators-bdqk2"
Oct 03 14:49:07 crc kubenswrapper[4636]: I1003 14:49:07.869954 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdszr\" (UniqueName: \"kubernetes.io/projected/51ecaf38-b4a1-477f-9620-6104f3b5787c-kube-api-access-xdszr\") pod \"community-operators-bdqk2\" (UID: \"51ecaf38-b4a1-477f-9620-6104f3b5787c\") " pod="openshift-marketplace/community-operators-bdqk2"
Oct 03 14:49:07 crc kubenswrapper[4636]: I1003 14:49:07.899842 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdszr\" (UniqueName: \"kubernetes.io/projected/51ecaf38-b4a1-477f-9620-6104f3b5787c-kube-api-access-xdszr\") pod \"community-operators-bdqk2\" (UID: \"51ecaf38-b4a1-477f-9620-6104f3b5787c\") " pod="openshift-marketplace/community-operators-bdqk2"
Oct 03 14:49:08 crc kubenswrapper[4636]: I1003 14:49:08.050319 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bdqk2"
Oct 03 14:49:08 crc kubenswrapper[4636]: I1003 14:49:08.604105 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bdqk2"]
Oct 03 14:49:09 crc kubenswrapper[4636]: I1003 14:49:09.163320 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:49:09 crc kubenswrapper[4636]: I1003 14:49:09.163592 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:49:09 crc kubenswrapper[4636]: I1003 14:49:09.185786 4636 generic.go:334] "Generic (PLEG): container finished" podID="51ecaf38-b4a1-477f-9620-6104f3b5787c" containerID="4cda84113b962f08c4bf5b3c512b338b0432576385450b08562c957856d1a288" exitCode=0
Oct 03 14:49:09 crc kubenswrapper[4636]: I1003 14:49:09.185824 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdqk2" event={"ID":"51ecaf38-b4a1-477f-9620-6104f3b5787c","Type":"ContainerDied","Data":"4cda84113b962f08c4bf5b3c512b338b0432576385450b08562c957856d1a288"}
Oct 03 14:49:09 crc kubenswrapper[4636]: I1003 14:49:09.185886 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdqk2" event={"ID":"51ecaf38-b4a1-477f-9620-6104f3b5787c","Type":"ContainerStarted","Data":"0b199573970a6236608f95e138c58fe3326d6a42ac3c2476736341f27964e633"}
Oct 03 14:49:10 crc kubenswrapper[4636]: I1003 14:49:10.196429 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdqk2" event={"ID":"51ecaf38-b4a1-477f-9620-6104f3b5787c","Type":"ContainerStarted","Data":"9d2fb431392b67170347736d62d7e2abb283d7b907a87703d08be5874b8c662f"}
Oct 03 14:49:11 crc kubenswrapper[4636]: I1003 14:49:11.218856 4636 generic.go:334] "Generic (PLEG): container finished" podID="51ecaf38-b4a1-477f-9620-6104f3b5787c" containerID="9d2fb431392b67170347736d62d7e2abb283d7b907a87703d08be5874b8c662f" exitCode=0
Oct 03 14:49:11 crc kubenswrapper[4636]: I1003 14:49:11.219135 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdqk2" event={"ID":"51ecaf38-b4a1-477f-9620-6104f3b5787c","Type":"ContainerDied","Data":"9d2fb431392b67170347736d62d7e2abb283d7b907a87703d08be5874b8c662f"}
Oct 03 14:49:11 crc kubenswrapper[4636]: E1003 14:49:11.310511 4636 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51ecaf38_b4a1_477f_9620_6104f3b5787c.slice/crio-conmon-9d2fb431392b67170347736d62d7e2abb283d7b907a87703d08be5874b8c662f.scope\": RecentStats: unable to find data in memory cache]"
Oct 03 14:49:12 crc kubenswrapper[4636]: I1003 14:49:12.234949 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdqk2" event={"ID":"51ecaf38-b4a1-477f-9620-6104f3b5787c","Type":"ContainerStarted","Data":"dddc364fcb9fc5b5f85d56b1027231e0de9ac07d0dca7655dfc2d97b8e88f677"}
Oct 03 14:49:12 crc kubenswrapper[4636]: I1003 14:49:12.261312 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bdqk2" podStartSLOduration=2.439023348 podStartE2EDuration="5.261292297s" podCreationTimestamp="2025-10-03 14:49:07 +0000 UTC" firstStartedPulling="2025-10-03 14:49:09.187319339 +0000 UTC m=+2899.046045586" lastFinishedPulling="2025-10-03 14:49:12.009588288 +0000 UTC m=+2901.868314535" observedRunningTime="2025-10-03 14:49:12.255808534 +0000 UTC m=+2902.114534791" watchObservedRunningTime="2025-10-03 14:49:12.261292297 +0000 UTC m=+2902.120018544"
Oct 03 14:49:18 crc kubenswrapper[4636]: I1003 14:49:18.052817 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bdqk2"
Oct 03 14:49:18 crc kubenswrapper[4636]: I1003 14:49:18.054185 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bdqk2"
Oct 03 14:49:18 crc kubenswrapper[4636]: I1003 14:49:18.113639 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bdqk2"
Oct 03 14:49:18 crc kubenswrapper[4636]: I1003 14:49:18.328584 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bdqk2"
Oct 03 14:49:18 crc kubenswrapper[4636]: I1003 14:49:18.370303 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bdqk2"]
Oct 03 14:49:20 crc kubenswrapper[4636]: I1003 14:49:20.299387 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bdqk2" podUID="51ecaf38-b4a1-477f-9620-6104f3b5787c" containerName="registry-server" containerID="cri-o://dddc364fcb9fc5b5f85d56b1027231e0de9ac07d0dca7655dfc2d97b8e88f677" gracePeriod=2
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:20.745996 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bdqk2"
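The PLEG entries above carry a JSON payload (ID/Type/Data) that is machine-readable exactly as printed. A sketch, assuming only the line format shown in this log, that reconstructs per-pod container transitions; the 12-character ID truncation is just a display choice:

```python
import json
import re
from collections import defaultdict

PLEG = re.compile(r'"SyncLoop \(PLEG\): event for pod" pod="([^"]+)" event=(\{.*?\})')

def pleg_history(lines):
    """Collect ContainerStarted/ContainerDied transitions per pod, in log order."""
    hist = defaultdict(list)
    for line in lines:
        m = PLEG.search(line)
        if m:
            pod, ev = m.group(1), json.loads(m.group(2))
            hist[pod].append((ev["Type"], ev["Data"][:12]))  # short container ID
    return hist

# Fed this section's entries, community-operators-bdqk2 shows the catalog pod's
# init chain: Died(4cda84113b96, extract-utilities) -> Started(0b199573970a,
# sandbox) -> Started(9d2fb431392b) -> Died(9d2fb431392b, extract-content) ->
# Started(dddc364fcb9f, registry-server).
```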
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:20.941261 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51ecaf38-b4a1-477f-9620-6104f3b5787c-catalog-content\") pod \"51ecaf38-b4a1-477f-9620-6104f3b5787c\" (UID: \"51ecaf38-b4a1-477f-9620-6104f3b5787c\") "
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:20.941356 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdszr\" (UniqueName: \"kubernetes.io/projected/51ecaf38-b4a1-477f-9620-6104f3b5787c-kube-api-access-xdszr\") pod \"51ecaf38-b4a1-477f-9620-6104f3b5787c\" (UID: \"51ecaf38-b4a1-477f-9620-6104f3b5787c\") "
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:20.941434 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51ecaf38-b4a1-477f-9620-6104f3b5787c-utilities\") pod \"51ecaf38-b4a1-477f-9620-6104f3b5787c\" (UID: \"51ecaf38-b4a1-477f-9620-6104f3b5787c\") "
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:20.942807 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51ecaf38-b4a1-477f-9620-6104f3b5787c-utilities" (OuterVolumeSpecName: "utilities") pod "51ecaf38-b4a1-477f-9620-6104f3b5787c" (UID: "51ecaf38-b4a1-477f-9620-6104f3b5787c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:20.957214 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51ecaf38-b4a1-477f-9620-6104f3b5787c-kube-api-access-xdszr" (OuterVolumeSpecName: "kube-api-access-xdszr") pod "51ecaf38-b4a1-477f-9620-6104f3b5787c" (UID: "51ecaf38-b4a1-477f-9620-6104f3b5787c"). InnerVolumeSpecName "kube-api-access-xdszr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:21.000349 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51ecaf38-b4a1-477f-9620-6104f3b5787c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51ecaf38-b4a1-477f-9620-6104f3b5787c" (UID: "51ecaf38-b4a1-477f-9620-6104f3b5787c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:21.043795 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51ecaf38-b4a1-477f-9620-6104f3b5787c-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:21.043820 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdszr\" (UniqueName: \"kubernetes.io/projected/51ecaf38-b4a1-477f-9620-6104f3b5787c-kube-api-access-xdszr\") on node \"crc\" DevicePath \"\""
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:21.043833 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51ecaf38-b4a1-477f-9620-6104f3b5787c-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:21.308580 4636 generic.go:334] "Generic (PLEG): container finished" podID="51ecaf38-b4a1-477f-9620-6104f3b5787c" containerID="dddc364fcb9fc5b5f85d56b1027231e0de9ac07d0dca7655dfc2d97b8e88f677" exitCode=0
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:21.308618 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bdqk2"
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:21.308632 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdqk2" event={"ID":"51ecaf38-b4a1-477f-9620-6104f3b5787c","Type":"ContainerDied","Data":"dddc364fcb9fc5b5f85d56b1027231e0de9ac07d0dca7655dfc2d97b8e88f677"}
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:21.310205 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdqk2" event={"ID":"51ecaf38-b4a1-477f-9620-6104f3b5787c","Type":"ContainerDied","Data":"0b199573970a6236608f95e138c58fe3326d6a42ac3c2476736341f27964e633"}
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:21.310253 4636 scope.go:117] "RemoveContainer" containerID="dddc364fcb9fc5b5f85d56b1027231e0de9ac07d0dca7655dfc2d97b8e88f677"
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:21.353595 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bdqk2"]
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:21.354221 4636 scope.go:117] "RemoveContainer" containerID="9d2fb431392b67170347736d62d7e2abb283d7b907a87703d08be5874b8c662f"
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:21.364488 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bdqk2"]
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:21.385807 4636 scope.go:117] "RemoveContainer" containerID="4cda84113b962f08c4bf5b3c512b338b0432576385450b08562c957856d1a288"
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:21.431807 4636 scope.go:117] "RemoveContainer" containerID="dddc364fcb9fc5b5f85d56b1027231e0de9ac07d0dca7655dfc2d97b8e88f677"
Oct 03 14:49:21 crc kubenswrapper[4636]: E1003 14:49:21.433218 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dddc364fcb9fc5b5f85d56b1027231e0de9ac07d0dca7655dfc2d97b8e88f677\": container with ID starting with dddc364fcb9fc5b5f85d56b1027231e0de9ac07d0dca7655dfc2d97b8e88f677 not found: ID does not exist" containerID="dddc364fcb9fc5b5f85d56b1027231e0de9ac07d0dca7655dfc2d97b8e88f677"
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:21.433265 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dddc364fcb9fc5b5f85d56b1027231e0de9ac07d0dca7655dfc2d97b8e88f677"} err="failed to get container status \"dddc364fcb9fc5b5f85d56b1027231e0de9ac07d0dca7655dfc2d97b8e88f677\": rpc error: code = NotFound desc = could not find container \"dddc364fcb9fc5b5f85d56b1027231e0de9ac07d0dca7655dfc2d97b8e88f677\": container with ID starting with dddc364fcb9fc5b5f85d56b1027231e0de9ac07d0dca7655dfc2d97b8e88f677 not found: ID does not exist"
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:21.433295 4636 scope.go:117] "RemoveContainer" containerID="9d2fb431392b67170347736d62d7e2abb283d7b907a87703d08be5874b8c662f"
Oct 03 14:49:21 crc kubenswrapper[4636]: E1003 14:49:21.433681 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d2fb431392b67170347736d62d7e2abb283d7b907a87703d08be5874b8c662f\": container with ID starting with 9d2fb431392b67170347736d62d7e2abb283d7b907a87703d08be5874b8c662f not found: ID does not exist" containerID="9d2fb431392b67170347736d62d7e2abb283d7b907a87703d08be5874b8c662f"
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:21.433777 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d2fb431392b67170347736d62d7e2abb283d7b907a87703d08be5874b8c662f"} err="failed to get container status \"9d2fb431392b67170347736d62d7e2abb283d7b907a87703d08be5874b8c662f\": rpc error: code = NotFound desc = could not find container \"9d2fb431392b67170347736d62d7e2abb283d7b907a87703d08be5874b8c662f\": container with ID starting with 9d2fb431392b67170347736d62d7e2abb283d7b907a87703d08be5874b8c662f not found: ID does not exist"
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:21.433850 4636 scope.go:117] "RemoveContainer" containerID="4cda84113b962f08c4bf5b3c512b338b0432576385450b08562c957856d1a288"
Oct 03 14:49:21 crc kubenswrapper[4636]: E1003 14:49:21.434154 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cda84113b962f08c4bf5b3c512b338b0432576385450b08562c957856d1a288\": container with ID starting with 4cda84113b962f08c4bf5b3c512b338b0432576385450b08562c957856d1a288 not found: ID does not exist" containerID="4cda84113b962f08c4bf5b3c512b338b0432576385450b08562c957856d1a288"
Oct 03 14:49:21 crc kubenswrapper[4636]: I1003 14:49:21.434186 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cda84113b962f08c4bf5b3c512b338b0432576385450b08562c957856d1a288"} err="failed to get container status \"4cda84113b962f08c4bf5b3c512b338b0432576385450b08562c957856d1a288\": rpc error: code = NotFound desc = could not find container \"4cda84113b962f08c4bf5b3c512b338b0432576385450b08562c957856d1a288\": container with ID starting with 4cda84113b962f08c4bf5b3c512b338b0432576385450b08562c957856d1a288 not found: ID does not exist"
Oct 03 14:49:22 crc kubenswrapper[4636]: I1003 14:49:22.804322 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51ecaf38-b4a1-477f-9620-6104f3b5787c" path="/var/lib/kubelet/pods/51ecaf38-b4a1-477f-9620-6104f3b5787c/volumes"
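The NotFound churn above is benign: by the time the kubelet retries RemoveContainer, CRI-O has already dropped the containers, so the kubelet logs the error at info level ("DeleteContainer returned error") and carries on. Deletion against a CRI-style runtime is commonly made idempotent the same way; a sketch under that assumption (the runtime handle and its RemoveContainer method are hypothetical stand-ins, not a real client API):

```python
import grpc  # assumes a gRPC-based, CRI-style runtime client

def remove_container_idempotent(runtime, container_id: str) -> None:
    """Best-effort removal: a NotFound from the runtime means the container
    is already gone, which is the desired end state, so swallow it --
    mirroring how the kubelet above notes the error and moves on."""
    try:
        runtime.RemoveContainer(container_id)  # hypothetical client method
    except grpc.RpcError as e:
        if e.code() is grpc.StatusCode.NOT_FOUND:
            return  # already deleted: treat as success
        raise
```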
Oct 03 14:49:39 crc kubenswrapper[4636]: I1003 14:49:39.162948 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:49:39 crc kubenswrapper[4636]: I1003 14:49:39.163474 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:49:39 crc kubenswrapper[4636]: I1003 14:49:39.163519 4636 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ngmch"
Oct 03 14:49:39 crc kubenswrapper[4636]: I1003 14:49:39.163990 4636 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"} pod="openshift-machine-config-operator/machine-config-daemon-ngmch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 03 14:49:39 crc kubenswrapper[4636]: I1003 14:49:39.164045 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" containerID="cri-o://b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9" gracePeriod=600
Oct 03 14:49:39 crc kubenswrapper[4636]: E1003 14:49:39.281966 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:49:39 crc kubenswrapper[4636]: I1003 14:49:39.469054 4636 generic.go:334] "Generic (PLEG): container finished" podID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9" exitCode=0
Oct 03 14:49:39 crc kubenswrapper[4636]: I1003 14:49:39.469133 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerDied","Data":"b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"}
Oct 03 14:49:39 crc kubenswrapper[4636]: I1003 14:49:39.469186 4636 scope.go:117] "RemoveContainer" containerID="9b6189a9993a988a8228eca73a7aab98395784c64925bc5bfe9f78c4defdcb4d"
Oct 03 14:49:39 crc kubenswrapper[4636]: I1003 14:49:39.469709 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:49:39 crc kubenswrapper[4636]: E1003 14:49:39.470055 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:49:53 crc kubenswrapper[4636]: I1003 14:49:53.793973 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:49:53 crc kubenswrapper[4636]: E1003 14:49:53.794785 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:50:06 crc kubenswrapper[4636]: I1003 14:50:06.793547 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:50:06 crc kubenswrapper[4636]: E1003 14:50:06.794290 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:50:18 crc kubenswrapper[4636]: I1003 14:50:18.794234 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:50:18 crc kubenswrapper[4636]: E1003 14:50:18.794966 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:50:30 crc kubenswrapper[4636]: I1003 14:50:30.802113 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:50:30 crc kubenswrapper[4636]: E1003 14:50:30.804089 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:50:45 crc kubenswrapper[4636]: I1003 14:50:45.794139 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:50:45 crc kubenswrapper[4636]: E1003 14:50:45.794772 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:51:00 crc kubenswrapper[4636]: I1003 14:51:00.800761 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:51:00 crc kubenswrapper[4636]: E1003 14:51:00.801571 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:51:13 crc kubenswrapper[4636]: I1003 14:51:13.794715 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:51:13 crc kubenswrapper[4636]: E1003 14:51:13.796055 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:51:24 crc kubenswrapper[4636]: I1003 14:51:24.794902 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:51:24 crc kubenswrapper[4636]: E1003 14:51:24.795921 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:51:37 crc kubenswrapper[4636]: I1003 14:51:37.794586 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:51:37 crc kubenswrapper[4636]: E1003 14:51:37.795446 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:51:51 crc kubenswrapper[4636]: I1003 14:51:51.794257 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:51:51 crc kubenswrapper[4636]: E1003 14:51:51.794847 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:52:03 crc kubenswrapper[4636]: I1003 14:52:03.794592 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:52:03 crc kubenswrapper[4636]: E1003 14:52:03.795459 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
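"back-off 5m0s" in the entries above is the cap of the kubelet's container-restart backoff. Assuming the commonly documented defaults of an initial 10s delay that doubles per restart up to a 5m ceiling (only the ceiling itself is visible in this log), the schedule looks like:

```python
def restart_backoff(restarts: int, base: float = 10.0, cap: float = 300.0) -> float:
    """Assumed kubelet-style restart backoff: base doubles per restart, capped."""
    return min(base * (2 ** restarts), cap)

schedule = [restart_backoff(n) for n in range(7)]
# [10.0, 20.0, 40.0, 80.0, 160.0, 300.0, 300.0]
# The log's "back-off 5m0s" corresponds to the 300s cap; the ~12-15s cadence of
# the repeated entries is the sync loop re-evaluating the pod while it waits,
# not the backoff interval itself.
```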
Oct 03 14:52:18 crc kubenswrapper[4636]: I1003 14:52:18.794163 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:52:18 crc kubenswrapper[4636]: E1003 14:52:18.794715 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:52:29 crc kubenswrapper[4636]: I1003 14:52:29.037531 4636 generic.go:334] "Generic (PLEG): container finished" podID="88e3290e-0c0d-4304-bcd2-b500068dc443" containerID="86c415f79a5609ed284b044e3d6de0b99f7c2d463a15c508586d2b720ae0d7ab" exitCode=0
Oct 03 14:52:29 crc kubenswrapper[4636]: I1003 14:52:29.037599 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" event={"ID":"88e3290e-0c0d-4304-bcd2-b500068dc443","Type":"ContainerDied","Data":"86c415f79a5609ed284b044e3d6de0b99f7c2d463a15c508586d2b720ae0d7ab"}
Oct 03 14:52:30 crc kubenswrapper[4636]: I1003 14:52:30.616084 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj"
Oct 03 14:52:30 crc kubenswrapper[4636]: I1003 14:52:30.663029 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ceilometer-compute-config-data-1\") pod \"88e3290e-0c0d-4304-bcd2-b500068dc443\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") "
Oct 03 14:52:30 crc kubenswrapper[4636]: I1003 14:52:30.663136 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-inventory\") pod \"88e3290e-0c0d-4304-bcd2-b500068dc443\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") "
Oct 03 14:52:30 crc kubenswrapper[4636]: I1003 14:52:30.663172 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzc2n\" (UniqueName: \"kubernetes.io/projected/88e3290e-0c0d-4304-bcd2-b500068dc443-kube-api-access-jzc2n\") pod \"88e3290e-0c0d-4304-bcd2-b500068dc443\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") "
Oct 03 14:52:30 crc kubenswrapper[4636]: I1003 14:52:30.663255 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ceilometer-compute-config-data-2\") pod \"88e3290e-0c0d-4304-bcd2-b500068dc443\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") "
Oct 03 14:52:30 crc kubenswrapper[4636]: I1003 14:52:30.663316 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-telemetry-combined-ca-bundle\") pod \"88e3290e-0c0d-4304-bcd2-b500068dc443\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") "
Oct 03 14:52:30 crc kubenswrapper[4636]: I1003 14:52:30.663350 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ceilometer-compute-config-data-0\") pod \"88e3290e-0c0d-4304-bcd2-b500068dc443\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") "
Oct 03 14:52:30 crc kubenswrapper[4636]: I1003 14:52:30.663423 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ssh-key\") pod \"88e3290e-0c0d-4304-bcd2-b500068dc443\" (UID: \"88e3290e-0c0d-4304-bcd2-b500068dc443\") "
Oct 03 14:52:30 crc kubenswrapper[4636]: I1003 14:52:30.671596 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "88e3290e-0c0d-4304-bcd2-b500068dc443" (UID: "88e3290e-0c0d-4304-bcd2-b500068dc443"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:52:30 crc kubenswrapper[4636]: I1003 14:52:30.674081 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88e3290e-0c0d-4304-bcd2-b500068dc443-kube-api-access-jzc2n" (OuterVolumeSpecName: "kube-api-access-jzc2n") pod "88e3290e-0c0d-4304-bcd2-b500068dc443" (UID: "88e3290e-0c0d-4304-bcd2-b500068dc443"). InnerVolumeSpecName "kube-api-access-jzc2n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:52:30 crc kubenswrapper[4636]: I1003 14:52:30.697837 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "88e3290e-0c0d-4304-bcd2-b500068dc443" (UID: "88e3290e-0c0d-4304-bcd2-b500068dc443"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:52:30 crc kubenswrapper[4636]: I1003 14:52:30.704863 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-inventory" (OuterVolumeSpecName: "inventory") pod "88e3290e-0c0d-4304-bcd2-b500068dc443" (UID: "88e3290e-0c0d-4304-bcd2-b500068dc443"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:52:30 crc kubenswrapper[4636]: I1003 14:52:30.707888 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "88e3290e-0c0d-4304-bcd2-b500068dc443" (UID: "88e3290e-0c0d-4304-bcd2-b500068dc443"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:52:30 crc kubenswrapper[4636]: I1003 14:52:30.709436 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "88e3290e-0c0d-4304-bcd2-b500068dc443" (UID: "88e3290e-0c0d-4304-bcd2-b500068dc443"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:52:30 crc kubenswrapper[4636]: I1003 14:52:30.731392 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "88e3290e-0c0d-4304-bcd2-b500068dc443" (UID: "88e3290e-0c0d-4304-bcd2-b500068dc443"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 14:52:30 crc kubenswrapper[4636]: I1003 14:52:30.766319 4636 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Oct 03 14:52:30 crc kubenswrapper[4636]: I1003 14:52:30.766358 4636 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-inventory\") on node \"crc\" DevicePath \"\""
Oct 03 14:52:30 crc kubenswrapper[4636]: I1003 14:52:30.766373 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzc2n\" (UniqueName: \"kubernetes.io/projected/88e3290e-0c0d-4304-bcd2-b500068dc443-kube-api-access-jzc2n\") on node \"crc\" DevicePath \"\""
Oct 03 14:52:30 crc kubenswrapper[4636]: I1003 14:52:30.766388 4636 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Oct 03 14:52:30 crc kubenswrapper[4636]: I1003 14:52:30.766400 4636 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 14:52:30 crc kubenswrapper[4636]: I1003 14:52:30.766412 4636 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Oct 03 14:52:30 crc kubenswrapper[4636]: I1003 14:52:30.766422 4636 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88e3290e-0c0d-4304-bcd2-b500068dc443-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 03 14:52:31 crc kubenswrapper[4636]: I1003 14:52:31.054956 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj" event={"ID":"88e3290e-0c0d-4304-bcd2-b500068dc443","Type":"ContainerDied","Data":"de628da2f3424a1066ce35ea101557cbf7db6ee80e20339fc8c7437474a42538"}
Oct 03 14:52:31 crc kubenswrapper[4636]: I1003 14:52:31.055277 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de628da2f3424a1066ce35ea101557cbf7db6ee80e20339fc8c7437474a42538"
Oct 03 14:52:31 crc kubenswrapper[4636]: I1003 14:52:31.055000 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4njmj"
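The teardown above follows a strict per-volume progression: "UnmountVolume started", then "UnmountVolume.TearDown succeeded", then "Volume detached", for all seven of the pod's volumes (the three ceilometer-compute-config-data-N secrets, inventory, ssh-key, telemetry-combined-ca-bundle, and kube-api-access-jzc2n). A small checker for logs in this format that reports volumes which began unmounting but never detached; the doubled backslashes in the patterns match the \" quoting as it appears in these journal lines:

```python
import re

UNMOUNTED = re.compile(r'UnmountVolume started for volume \\"([^\\"]+)\\"')
DETACHED  = re.compile(r'Volume detached for volume \\"([^\\"]+)\\"')

def teardown_gaps(lines):
    """Return volume names that started unmounting but never reached
    'Volume detached'. For the teardown above this is the empty set."""
    started, done = set(), set()
    for line in lines:
        if (m := UNMOUNTED.search(line)):
            started.add(m.group(1))
        if (m := DETACHED.search(line)):
            done.add(m.group(1))
    return started - done
```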
Oct 03 14:52:31 crc kubenswrapper[4636]: I1003 14:52:31.794864 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:52:31 crc kubenswrapper[4636]: E1003 14:52:31.795178 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:52:42 crc kubenswrapper[4636]: I1003 14:52:42.794981 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:52:42 crc kubenswrapper[4636]: E1003 14:52:42.796924 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:52:56 crc kubenswrapper[4636]: I1003 14:52:56.793507 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:52:56 crc kubenswrapper[4636]: E1003 14:52:56.794233 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:53:10 crc kubenswrapper[4636]: I1003 14:53:10.799541 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:53:10 crc kubenswrapper[4636]: E1003 14:53:10.800663 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.476050 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Oct 03 14:53:18 crc kubenswrapper[4636]: E1003 14:53:18.476914 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e3290e-0c0d-4304-bcd2-b500068dc443" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.476928 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e3290e-0c0d-4304-bcd2-b500068dc443" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Oct 03 14:53:18 crc kubenswrapper[4636]: E1003 14:53:18.476943 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ecaf38-b4a1-477f-9620-6104f3b5787c" containerName="extract-utilities"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.476949 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ecaf38-b4a1-477f-9620-6104f3b5787c" containerName="extract-utilities"
Oct 03 14:53:18 crc kubenswrapper[4636]: E1003 14:53:18.476966 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ecaf38-b4a1-477f-9620-6104f3b5787c" containerName="registry-server"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.476972 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ecaf38-b4a1-477f-9620-6104f3b5787c" containerName="registry-server"
Oct 03 14:53:18 crc kubenswrapper[4636]: E1003 14:53:18.476999 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ecaf38-b4a1-477f-9620-6104f3b5787c" containerName="extract-content"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.477004 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ecaf38-b4a1-477f-9620-6104f3b5787c" containerName="extract-content"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.477186 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ecaf38-b4a1-477f-9620-6104f3b5787c" containerName="registry-server"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.477204 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="88e3290e-0c0d-4304-bcd2-b500068dc443" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.477763 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.481690 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-trj5l"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.481822 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.482217 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.483152 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.498440 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.568915 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76d391b3-cee3-4591-814b-a1b99bed1872-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.568959 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/76d391b3-cee3-4591-814b-a1b99bed1872-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.569070 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/76d391b3-cee3-4591-814b-a1b99bed1872-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.569203 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/76d391b3-cee3-4591-814b-a1b99bed1872-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.569325 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/76d391b3-cee3-4591-814b-a1b99bed1872-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.569850 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4zhp\" (UniqueName: \"kubernetes.io/projected/76d391b3-cee3-4591-814b-a1b99bed1872-kube-api-access-d4zhp\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.569970 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/76d391b3-cee3-4591-814b-a1b99bed1872-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.570091 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.570143 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76d391b3-cee3-4591-814b-a1b99bed1872-config-data\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.671956 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/76d391b3-cee3-4591-814b-a1b99bed1872-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.672065 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/76d391b3-cee3-4591-814b-a1b99bed1872-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.672158 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4zhp\" (UniqueName: \"kubernetes.io/projected/76d391b3-cee3-4591-814b-a1b99bed1872-kube-api-access-d4zhp\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.672210 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/76d391b3-cee3-4591-814b-a1b99bed1872-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.672267 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.672298 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76d391b3-cee3-4591-814b-a1b99bed1872-config-data\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.672369 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76d391b3-cee3-4591-814b-a1b99bed1872-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.672404 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/76d391b3-cee3-4591-814b-a1b99bed1872-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.672437 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/76d391b3-cee3-4591-814b-a1b99bed1872-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.672605 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/76d391b3-cee3-4591-814b-a1b99bed1872-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.672686 4636 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.673622 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/76d391b3-cee3-4591-814b-a1b99bed1872-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.673932 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76d391b3-cee3-4591-814b-a1b99bed1872-config-data\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.674214 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/76d391b3-cee3-4591-814b-a1b99bed1872-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.678188 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/76d391b3-cee3-4591-814b-a1b99bed1872-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.678822 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/76d391b3-cee3-4591-814b-a1b99bed1872-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.689839 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76d391b3-cee3-4591-814b-a1b99bed1872-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.693484 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4zhp\" (UniqueName: \"kubernetes.io/projected/76d391b3-cee3-4591-814b-a1b99bed1872-kube-api-access-d4zhp\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.705176 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " pod="openstack/tempest-tests-tempest"
Oct 03 14:53:18 crc kubenswrapper[4636]: I1003 14:53:18.800985 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Need to start a new one" pod="openstack/tempest-tests-tempest"
Oct 03 14:53:19 crc kubenswrapper[4636]: I1003 14:53:19.321676 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Oct 03 14:53:19 crc kubenswrapper[4636]: I1003 14:53:19.340127 4636 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 03 14:53:19 crc kubenswrapper[4636]: I1003 14:53:19.498015 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"76d391b3-cee3-4591-814b-a1b99bed1872","Type":"ContainerStarted","Data":"b27fb94c56f2a2c3b5bd262a8bf534138a99a0962c3eb32c0a66e87371ac8aab"}
Oct 03 14:53:22 crc kubenswrapper[4636]: I1003 14:53:22.794584 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:53:22 crc kubenswrapper[4636]: E1003 14:53:22.795474 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:53:37 crc kubenswrapper[4636]: I1003 14:53:37.793843 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:53:37 crc kubenswrapper[4636]: E1003 14:53:37.794743 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:53:51 crc kubenswrapper[4636]: I1003 14:53:51.794429 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:53:51 crc kubenswrapper[4636]: E1003 14:53:51.795331 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:54:04 crc kubenswrapper[4636]: E1003 14:54:04.591069 4636 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified"
Oct 03 14:54:04 crc kubenswrapper[4636]: E1003 14:54:04.591701 4636 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d4zhp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(76d391b3-cee3-4591-814b-a1b99bed1872): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Oct 03 14:54:04 crc kubenswrapper[4636]: E1003 14:54:04.592940 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="76d391b3-cee3-4591-814b-a1b99bed1872"
Oct 03 14:54:04 crc kubenswrapper[4636]: E1003 14:54:04.952710 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="76d391b3-cee3-4591-814b-a1b99bed1872"
Oct 03 14:54:06 crc kubenswrapper[4636]: I1003 14:54:06.800024 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:54:06 crc kubenswrapper[4636]: E1003 14:54:06.800425 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:54:18 crc kubenswrapper[4636]: I1003 14:54:18.654348 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Oct 03 14:54:20 crc kubenswrapper[4636]: I1003 14:54:20.109839 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"76d391b3-cee3-4591-814b-a1b99bed1872","Type":"ContainerStarted","Data":"e5f489f65481472cf5eef3b21310941d6599e7df420d8d6207e2e83bd20d6cc6"}
Oct 03 14:54:20 crc kubenswrapper[4636]: I1003 14:54:20.138538 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.827421696 podStartE2EDuration="1m3.138515516s" podCreationTimestamp="2025-10-03 14:53:17 +0000 UTC" firstStartedPulling="2025-10-03 14:53:19.339876256 +0000 UTC m=+3149.198602503" lastFinishedPulling="2025-10-03 14:54:18.650970056 +0000 UTC m=+3208.509696323" observedRunningTime="2025-10-03 14:54:20.126845936 +0000 UTC m=+3209.985572183" watchObservedRunningTime="2025-10-03 14:54:20.138515516 +0000 UTC m=+3209.997241763"
Oct 03 14:54:20 crc kubenswrapper[4636]: I1003 14:54:20.824350 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:54:20 crc kubenswrapper[4636]: E1003 14:54:20.825036 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:54:31 crc kubenswrapper[4636]: I1003 14:54:31.796192 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:54:31 crc kubenswrapper[4636]: E1003 14:54:31.797419 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch"
podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 14:54:41 crc kubenswrapper[4636]: I1003 14:54:41.179262 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bs2ts"]
Oct 03 14:54:41 crc kubenswrapper[4636]: I1003 14:54:41.183173 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bs2ts"
Oct 03 14:54:41 crc kubenswrapper[4636]: I1003 14:54:41.199876 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bs2ts"]
Oct 03 14:54:41 crc kubenswrapper[4636]: I1003 14:54:41.310132 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d9ee5c3-70df-4042-b54d-85c49a2db9e1-utilities\") pod \"redhat-operators-bs2ts\" (UID: \"6d9ee5c3-70df-4042-b54d-85c49a2db9e1\") " pod="openshift-marketplace/redhat-operators-bs2ts"
Oct 03 14:54:41 crc kubenswrapper[4636]: I1003 14:54:41.310503 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d9ee5c3-70df-4042-b54d-85c49a2db9e1-catalog-content\") pod \"redhat-operators-bs2ts\" (UID: \"6d9ee5c3-70df-4042-b54d-85c49a2db9e1\") " pod="openshift-marketplace/redhat-operators-bs2ts"
Oct 03 14:54:41 crc kubenswrapper[4636]: I1003 14:54:41.310571 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2858\" (UniqueName: \"kubernetes.io/projected/6d9ee5c3-70df-4042-b54d-85c49a2db9e1-kube-api-access-d2858\") pod \"redhat-operators-bs2ts\" (UID: \"6d9ee5c3-70df-4042-b54d-85c49a2db9e1\") " pod="openshift-marketplace/redhat-operators-bs2ts"
Oct 03 14:54:41 crc kubenswrapper[4636]: I1003 14:54:41.412816 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d9ee5c3-70df-4042-b54d-85c49a2db9e1-utilities\") pod \"redhat-operators-bs2ts\" (UID: \"6d9ee5c3-70df-4042-b54d-85c49a2db9e1\") " pod="openshift-marketplace/redhat-operators-bs2ts"
Oct 03 14:54:41 crc kubenswrapper[4636]: I1003 14:54:41.412958 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d9ee5c3-70df-4042-b54d-85c49a2db9e1-catalog-content\") pod \"redhat-operators-bs2ts\" (UID: \"6d9ee5c3-70df-4042-b54d-85c49a2db9e1\") " pod="openshift-marketplace/redhat-operators-bs2ts"
Oct 03 14:54:41 crc kubenswrapper[4636]: I1003 14:54:41.413070 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2858\" (UniqueName: \"kubernetes.io/projected/6d9ee5c3-70df-4042-b54d-85c49a2db9e1-kube-api-access-d2858\") pod \"redhat-operators-bs2ts\" (UID: \"6d9ee5c3-70df-4042-b54d-85c49a2db9e1\") " pod="openshift-marketplace/redhat-operators-bs2ts"
Oct 03 14:54:41 crc kubenswrapper[4636]: I1003 14:54:41.413483 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d9ee5c3-70df-4042-b54d-85c49a2db9e1-utilities\") pod \"redhat-operators-bs2ts\" (UID: \"6d9ee5c3-70df-4042-b54d-85c49a2db9e1\") " pod="openshift-marketplace/redhat-operators-bs2ts"
Oct 03 14:54:41 crc kubenswrapper[4636]: I1003 14:54:41.413663 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d9ee5c3-70df-4042-b54d-85c49a2db9e1-catalog-content\") pod \"redhat-operators-bs2ts\" (UID: \"6d9ee5c3-70df-4042-b54d-85c49a2db9e1\") " pod="openshift-marketplace/redhat-operators-bs2ts"
Oct 03 14:54:41 crc kubenswrapper[4636]: I1003 14:54:41.431541 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2858\" (UniqueName: \"kubernetes.io/projected/6d9ee5c3-70df-4042-b54d-85c49a2db9e1-kube-api-access-d2858\") pod \"redhat-operators-bs2ts\" (UID: \"6d9ee5c3-70df-4042-b54d-85c49a2db9e1\") " pod="openshift-marketplace/redhat-operators-bs2ts"
Oct 03 14:54:41 crc kubenswrapper[4636]: I1003 14:54:41.503044 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bs2ts"
Oct 03 14:54:42 crc kubenswrapper[4636]: I1003 14:54:41.999738 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bs2ts"]
Oct 03 14:54:42 crc kubenswrapper[4636]: I1003 14:54:42.301082 4636 generic.go:334] "Generic (PLEG): container finished" podID="6d9ee5c3-70df-4042-b54d-85c49a2db9e1" containerID="457c3880561c363482cea1820224754ecc6a80f2ab9d720dac0939fb3d9cb2fb" exitCode=0
Oct 03 14:54:42 crc kubenswrapper[4636]: I1003 14:54:42.301435 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs2ts" event={"ID":"6d9ee5c3-70df-4042-b54d-85c49a2db9e1","Type":"ContainerDied","Data":"457c3880561c363482cea1820224754ecc6a80f2ab9d720dac0939fb3d9cb2fb"}
Oct 03 14:54:42 crc kubenswrapper[4636]: I1003 14:54:42.301458 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs2ts" event={"ID":"6d9ee5c3-70df-4042-b54d-85c49a2db9e1","Type":"ContainerStarted","Data":"bc244c878af576a97b80ffe75e32e00cfdbb96aa2345bcfad6147491e20d68e7"}
Oct 03 14:54:44 crc kubenswrapper[4636]: I1003 14:54:44.326601 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs2ts" event={"ID":"6d9ee5c3-70df-4042-b54d-85c49a2db9e1","Type":"ContainerStarted","Data":"653cc7f06bf945ce652bfe0251df158bc5677613f7e4671d171a5604512f5565"}
Oct 03 14:54:45 crc kubenswrapper[4636]: I1003 14:54:45.799262 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9"
Oct 03 14:54:46 crc kubenswrapper[4636]: I1003 14:54:46.347294 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerStarted","Data":"67ba3388c355a3fdeaa7cdee1d86d23b98df73b28126852c58f76489bd32226f"}
Oct 03 14:54:47 crc kubenswrapper[4636]: I1003 14:54:47.358289 4636 generic.go:334] "Generic (PLEG): container finished" podID="6d9ee5c3-70df-4042-b54d-85c49a2db9e1" containerID="653cc7f06bf945ce652bfe0251df158bc5677613f7e4671d171a5604512f5565" exitCode=0
Oct 03 14:54:47 crc kubenswrapper[4636]: I1003 14:54:47.358488 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs2ts" event={"ID":"6d9ee5c3-70df-4042-b54d-85c49a2db9e1","Type":"ContainerDied","Data":"653cc7f06bf945ce652bfe0251df158bc5677613f7e4671d171a5604512f5565"}
Oct 03 14:54:48 crc kubenswrapper[4636]: I1003 14:54:48.368718 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs2ts" event={"ID":"6d9ee5c3-70df-4042-b54d-85c49a2db9e1","Type":"ContainerStarted","Data":"e6b4d3f01c2c728d8bdf390a4c77b51c6231b4a01e2e8f9f0f39a7d0ce8e81e1"}
Oct 03 14:54:48 crc kubenswrapper[4636]: I1003 14:54:48.387551 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bs2ts" podStartSLOduration=1.909553071 podStartE2EDuration="7.387527803s" podCreationTimestamp="2025-10-03 14:54:41 +0000 UTC" firstStartedPulling="2025-10-03 14:54:42.304219272 +0000 UTC m=+3232.162945519" lastFinishedPulling="2025-10-03 14:54:47.782194014 +0000 UTC m=+3237.640920251" observedRunningTime="2025-10-03 14:54:48.383698261 +0000 UTC m=+3238.242424508" watchObservedRunningTime="2025-10-03 14:54:48.387527803 +0000 UTC m=+3238.246254050"
Oct 03 14:54:51 crc kubenswrapper[4636]: I1003 14:54:51.505492 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bs2ts"
Oct 03 14:54:51 crc kubenswrapper[4636]: I1003 14:54:51.506219 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bs2ts"
Oct 03 14:54:52 crc kubenswrapper[4636]: I1003 14:54:52.569528 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bs2ts" podUID="6d9ee5c3-70df-4042-b54d-85c49a2db9e1" containerName="registry-server" probeResult="failure" output=<
Oct 03 14:54:52 crc kubenswrapper[4636]: timeout: failed to connect service ":50051" within 1s
Oct 03 14:54:52 crc kubenswrapper[4636]: >
Oct 03 14:55:02 crc kubenswrapper[4636]: I1003 14:55:02.585884 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bs2ts" podUID="6d9ee5c3-70df-4042-b54d-85c49a2db9e1" containerName="registry-server" probeResult="failure" output=<
Oct 03 14:55:02 crc kubenswrapper[4636]: timeout: failed to connect service ":50051" within 1s
Oct 03 14:55:02 crc kubenswrapper[4636]: >
Oct 03 14:55:12 crc kubenswrapper[4636]: I1003 14:55:12.563637 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bs2ts" podUID="6d9ee5c3-70df-4042-b54d-85c49a2db9e1" containerName="registry-server" probeResult="failure" output=<
Oct 03 14:55:12 crc kubenswrapper[4636]: timeout: failed to connect service ":50051" within 1s
Oct 03 14:55:12 crc kubenswrapper[4636]: >
Oct 03 14:55:22 crc kubenswrapper[4636]: I1003 14:55:22.556865 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bs2ts" podUID="6d9ee5c3-70df-4042-b54d-85c49a2db9e1" containerName="registry-server" probeResult="failure" output=<
Oct 03 14:55:22 crc kubenswrapper[4636]: timeout: failed to connect service ":50051" within 1s
Oct 03 14:55:22 crc kubenswrapper[4636]: >
Oct 03 14:55:31 crc kubenswrapper[4636]: I1003 14:55:31.563743 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bs2ts"
Oct 03 14:55:31 crc kubenswrapper[4636]: I1003 14:55:31.620177 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bs2ts"
Oct 03 14:55:34 crc kubenswrapper[4636]: I1003 14:55:34.133321 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bs2ts"]
Oct 03 14:55:34 crc kubenswrapper[4636]: I1003 14:55:34.133783 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bs2ts" podUID="6d9ee5c3-70df-4042-b54d-85c49a2db9e1" containerName="registry-server" containerID="cri-o://e6b4d3f01c2c728d8bdf390a4c77b51c6231b4a01e2e8f9f0f39a7d0ce8e81e1" gracePeriod=2
Oct 03 14:55:34 crc kubenswrapper[4636]: I1003 14:55:34.772296 4636 generic.go:334] "Generic (PLEG): container finished" podID="6d9ee5c3-70df-4042-b54d-85c49a2db9e1" containerID="e6b4d3f01c2c728d8bdf390a4c77b51c6231b4a01e2e8f9f0f39a7d0ce8e81e1" exitCode=0
Oct 03 14:55:34 crc kubenswrapper[4636]: I1003 14:55:34.772349 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs2ts" event={"ID":"6d9ee5c3-70df-4042-b54d-85c49a2db9e1","Type":"ContainerDied","Data":"e6b4d3f01c2c728d8bdf390a4c77b51c6231b4a01e2e8f9f0f39a7d0ce8e81e1"}
Oct 03 14:55:34 crc kubenswrapper[4636]: I1003 14:55:34.772676 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs2ts" event={"ID":"6d9ee5c3-70df-4042-b54d-85c49a2db9e1","Type":"ContainerDied","Data":"bc244c878af576a97b80ffe75e32e00cfdbb96aa2345bcfad6147491e20d68e7"}
Oct 03 14:55:34 crc kubenswrapper[4636]: I1003 14:55:34.772699 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc244c878af576a97b80ffe75e32e00cfdbb96aa2345bcfad6147491e20d68e7"
Oct 03 14:55:34 crc kubenswrapper[4636]: I1003 14:55:34.845192 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bs2ts"
Oct 03 14:55:35 crc kubenswrapper[4636]: I1003 14:55:35.012818 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d9ee5c3-70df-4042-b54d-85c49a2db9e1-catalog-content\") pod \"6d9ee5c3-70df-4042-b54d-85c49a2db9e1\" (UID: \"6d9ee5c3-70df-4042-b54d-85c49a2db9e1\") "
Oct 03 14:55:35 crc kubenswrapper[4636]: I1003 14:55:35.013192 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2858\" (UniqueName: \"kubernetes.io/projected/6d9ee5c3-70df-4042-b54d-85c49a2db9e1-kube-api-access-d2858\") pod \"6d9ee5c3-70df-4042-b54d-85c49a2db9e1\" (UID: \"6d9ee5c3-70df-4042-b54d-85c49a2db9e1\") "
Oct 03 14:55:35 crc kubenswrapper[4636]: I1003 14:55:35.013433 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d9ee5c3-70df-4042-b54d-85c49a2db9e1-utilities\") pod \"6d9ee5c3-70df-4042-b54d-85c49a2db9e1\" (UID: \"6d9ee5c3-70df-4042-b54d-85c49a2db9e1\") "
Oct 03 14:55:35 crc kubenswrapper[4636]: I1003 14:55:35.013959 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d9ee5c3-70df-4042-b54d-85c49a2db9e1-utilities" (OuterVolumeSpecName: "utilities") pod "6d9ee5c3-70df-4042-b54d-85c49a2db9e1" (UID: "6d9ee5c3-70df-4042-b54d-85c49a2db9e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:55:35 crc kubenswrapper[4636]: I1003 14:55:35.024349 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d9ee5c3-70df-4042-b54d-85c49a2db9e1-kube-api-access-d2858" (OuterVolumeSpecName: "kube-api-access-d2858") pod "6d9ee5c3-70df-4042-b54d-85c49a2db9e1" (UID: "6d9ee5c3-70df-4042-b54d-85c49a2db9e1"). InnerVolumeSpecName "kube-api-access-d2858".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:55:35 crc kubenswrapper[4636]: I1003 14:55:35.076392 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d9ee5c3-70df-4042-b54d-85c49a2db9e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d9ee5c3-70df-4042-b54d-85c49a2db9e1" (UID: "6d9ee5c3-70df-4042-b54d-85c49a2db9e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:55:35 crc kubenswrapper[4636]: I1003 14:55:35.115753 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d9ee5c3-70df-4042-b54d-85c49a2db9e1-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 14:55:35 crc kubenswrapper[4636]: I1003 14:55:35.115787 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2858\" (UniqueName: \"kubernetes.io/projected/6d9ee5c3-70df-4042-b54d-85c49a2db9e1-kube-api-access-d2858\") on node \"crc\" DevicePath \"\""
Oct 03 14:55:35 crc kubenswrapper[4636]: I1003 14:55:35.115800 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d9ee5c3-70df-4042-b54d-85c49a2db9e1-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 14:55:35 crc kubenswrapper[4636]: I1003 14:55:35.780608 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bs2ts"
Oct 03 14:55:35 crc kubenswrapper[4636]: I1003 14:55:35.817117 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bs2ts"]
Oct 03 14:55:35 crc kubenswrapper[4636]: I1003 14:55:35.839861 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bs2ts"]
Oct 03 14:55:36 crc kubenswrapper[4636]: I1003 14:55:36.803157 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d9ee5c3-70df-4042-b54d-85c49a2db9e1" path="/var/lib/kubelet/pods/6d9ee5c3-70df-4042-b54d-85c49a2db9e1/volumes"
Oct 03 14:56:25 crc kubenswrapper[4636]: I1003 14:56:25.050336 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nsx29"]
Oct 03 14:56:25 crc kubenswrapper[4636]: E1003 14:56:25.052145 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9ee5c3-70df-4042-b54d-85c49a2db9e1" containerName="registry-server"
Oct 03 14:56:25 crc kubenswrapper[4636]: I1003 14:56:25.052327 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9ee5c3-70df-4042-b54d-85c49a2db9e1" containerName="registry-server"
Oct 03 14:56:25 crc kubenswrapper[4636]: E1003 14:56:25.052392 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9ee5c3-70df-4042-b54d-85c49a2db9e1" containerName="extract-utilities"
Oct 03 14:56:25 crc kubenswrapper[4636]: I1003 14:56:25.052444 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9ee5c3-70df-4042-b54d-85c49a2db9e1" containerName="extract-utilities"
Oct 03 14:56:25 crc kubenswrapper[4636]: E1003 14:56:25.052511 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9ee5c3-70df-4042-b54d-85c49a2db9e1" containerName="extract-content"
Oct 03 14:56:25 crc kubenswrapper[4636]: I1003 14:56:25.052562 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9ee5c3-70df-4042-b54d-85c49a2db9e1" containerName="extract-content"
Oct 03 14:56:25 crc kubenswrapper[4636]: I1003 14:56:25.052800 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d9ee5c3-70df-4042-b54d-85c49a2db9e1" containerName="registry-server"
Oct 03 14:56:25 crc kubenswrapper[4636]: I1003 14:56:25.054266 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nsx29"
Oct 03 14:56:25 crc kubenswrapper[4636]: I1003 14:56:25.065656 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nsx29"]
Oct 03 14:56:25 crc kubenswrapper[4636]: I1003 14:56:25.107069 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-522d6\" (UniqueName: \"kubernetes.io/projected/1e315352-8bba-4fea-8fad-e88f718ca0a8-kube-api-access-522d6\") pod \"certified-operators-nsx29\" (UID: \"1e315352-8bba-4fea-8fad-e88f718ca0a8\") " pod="openshift-marketplace/certified-operators-nsx29"
Oct 03 14:56:25 crc kubenswrapper[4636]: I1003 14:56:25.107134 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e315352-8bba-4fea-8fad-e88f718ca0a8-utilities\") pod \"certified-operators-nsx29\" (UID: \"1e315352-8bba-4fea-8fad-e88f718ca0a8\") " pod="openshift-marketplace/certified-operators-nsx29"
Oct 03 14:56:25 crc kubenswrapper[4636]: I1003 14:56:25.107247 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e315352-8bba-4fea-8fad-e88f718ca0a8-catalog-content\") pod \"certified-operators-nsx29\" (UID: \"1e315352-8bba-4fea-8fad-e88f718ca0a8\") " pod="openshift-marketplace/certified-operators-nsx29"
Oct 03 14:56:25 crc kubenswrapper[4636]: I1003 14:56:25.209000 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e315352-8bba-4fea-8fad-e88f718ca0a8-catalog-content\") pod \"certified-operators-nsx29\" (UID: \"1e315352-8bba-4fea-8fad-e88f718ca0a8\") " pod="openshift-marketplace/certified-operators-nsx29"
Oct 03 14:56:25 crc kubenswrapper[4636]: I1003 14:56:25.209130 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-522d6\" (UniqueName: \"kubernetes.io/projected/1e315352-8bba-4fea-8fad-e88f718ca0a8-kube-api-access-522d6\") pod \"certified-operators-nsx29\" (UID: \"1e315352-8bba-4fea-8fad-e88f718ca0a8\") " pod="openshift-marketplace/certified-operators-nsx29"
Oct 03 14:56:25 crc kubenswrapper[4636]: I1003 14:56:25.209158 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e315352-8bba-4fea-8fad-e88f718ca0a8-utilities\") pod \"certified-operators-nsx29\" (UID: \"1e315352-8bba-4fea-8fad-e88f718ca0a8\") " pod="openshift-marketplace/certified-operators-nsx29"
Oct 03 14:56:25 crc kubenswrapper[4636]: I1003 14:56:25.209710 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e315352-8bba-4fea-8fad-e88f718ca0a8-catalog-content\") pod \"certified-operators-nsx29\" (UID: \"1e315352-8bba-4fea-8fad-e88f718ca0a8\") " pod="openshift-marketplace/certified-operators-nsx29"
Oct 03 14:56:25 crc kubenswrapper[4636]: I1003 14:56:25.209728 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e315352-8bba-4fea-8fad-e88f718ca0a8-utilities\") pod \"certified-operators-nsx29\" (UID: \"1e315352-8bba-4fea-8fad-e88f718ca0a8\") " pod="openshift-marketplace/certified-operators-nsx29"
Oct 03 14:56:25 crc kubenswrapper[4636]: I1003 14:56:25.230948 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-522d6\" (UniqueName: \"kubernetes.io/projected/1e315352-8bba-4fea-8fad-e88f718ca0a8-kube-api-access-522d6\") pod \"certified-operators-nsx29\" (UID: \"1e315352-8bba-4fea-8fad-e88f718ca0a8\") " pod="openshift-marketplace/certified-operators-nsx29"
Oct 03 14:56:25 crc kubenswrapper[4636]: I1003 14:56:25.387013 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nsx29"
Oct 03 14:56:25 crc kubenswrapper[4636]: I1003 14:56:25.978809 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nsx29"]
Oct 03 14:56:26 crc kubenswrapper[4636]: I1003 14:56:26.220465 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nsx29" event={"ID":"1e315352-8bba-4fea-8fad-e88f718ca0a8","Type":"ContainerStarted","Data":"162ba507b4303e9b71680555120906540d33ebbb58bd42e68000f39e47597c78"}
Oct 03 14:56:27 crc kubenswrapper[4636]: I1003 14:56:27.230214 4636 generic.go:334] "Generic (PLEG): container finished" podID="1e315352-8bba-4fea-8fad-e88f718ca0a8" containerID="76d37f376a8b98b1fbf62bf8865e995a2240ed9fd271bb1428c0f5e0548d4acc" exitCode=0
Oct 03 14:56:27 crc kubenswrapper[4636]: I1003 14:56:27.230312 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nsx29" event={"ID":"1e315352-8bba-4fea-8fad-e88f718ca0a8","Type":"ContainerDied","Data":"76d37f376a8b98b1fbf62bf8865e995a2240ed9fd271bb1428c0f5e0548d4acc"}
Oct 03 14:56:28 crc kubenswrapper[4636]: I1003 14:56:28.240736 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nsx29" event={"ID":"1e315352-8bba-4fea-8fad-e88f718ca0a8","Type":"ContainerStarted","Data":"a282237b85427269e9db462c4cce103604b0d0f111d2d0cb3e30085d890bbae8"}
Oct 03 14:56:30 crc kubenswrapper[4636]: I1003 14:56:30.262248 4636 generic.go:334] "Generic (PLEG): container finished" podID="1e315352-8bba-4fea-8fad-e88f718ca0a8" containerID="a282237b85427269e9db462c4cce103604b0d0f111d2d0cb3e30085d890bbae8" exitCode=0
Oct 03 14:56:30 crc kubenswrapper[4636]: I1003 14:56:30.262335 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nsx29" event={"ID":"1e315352-8bba-4fea-8fad-e88f718ca0a8","Type":"ContainerDied","Data":"a282237b85427269e9db462c4cce103604b0d0f111d2d0cb3e30085d890bbae8"}
Oct 03 14:56:31 crc kubenswrapper[4636]: I1003 14:56:31.273161 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nsx29" event={"ID":"1e315352-8bba-4fea-8fad-e88f718ca0a8","Type":"ContainerStarted","Data":"bbcd9378ddabaee5c5a719e01679024e61df2a4c3f4cf1914e08efefe2e7466b"}
Oct 03 14:56:35 crc kubenswrapper[4636]: I1003 14:56:35.387868 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nsx29"
Oct 03 14:56:35 crc kubenswrapper[4636]: I1003 14:56:35.388444 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nsx29"
Oct 03 14:56:35 crc kubenswrapper[4636]: I1003 14:56:35.436548 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nsx29"
Oct 03 14:56:35 crc kubenswrapper[4636]: I1003 14:56:35.460135 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nsx29" podStartSLOduration=6.994482383 podStartE2EDuration="10.460089739s" podCreationTimestamp="2025-10-03 14:56:25 +0000 UTC" firstStartedPulling="2025-10-03 14:56:27.232642509 +0000 UTC m=+3337.091368756" lastFinishedPulling="2025-10-03 14:56:30.698249865 +0000 UTC m=+3340.556976112" observedRunningTime="2025-10-03 14:56:31.294419412 +0000 UTC m=+3341.153145659" watchObservedRunningTime="2025-10-03 14:56:35.460089739 +0000 UTC m=+3345.318816006"
Oct 03 14:56:36 crc kubenswrapper[4636]: I1003 14:56:36.375877 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nsx29"
Oct 03 14:56:36 crc kubenswrapper[4636]: I1003 14:56:36.426692 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nsx29"]
Oct 03 14:56:38 crc kubenswrapper[4636]: I1003 14:56:38.330041 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nsx29" podUID="1e315352-8bba-4fea-8fad-e88f718ca0a8" containerName="registry-server" containerID="cri-o://bbcd9378ddabaee5c5a719e01679024e61df2a4c3f4cf1914e08efefe2e7466b" gracePeriod=2
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.055943 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nsx29"
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.221480 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e315352-8bba-4fea-8fad-e88f718ca0a8-utilities\") pod \"1e315352-8bba-4fea-8fad-e88f718ca0a8\" (UID: \"1e315352-8bba-4fea-8fad-e88f718ca0a8\") "
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.221603 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e315352-8bba-4fea-8fad-e88f718ca0a8-catalog-content\") pod \"1e315352-8bba-4fea-8fad-e88f718ca0a8\" (UID: \"1e315352-8bba-4fea-8fad-e88f718ca0a8\") "
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.221710 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-522d6\" (UniqueName: \"kubernetes.io/projected/1e315352-8bba-4fea-8fad-e88f718ca0a8-kube-api-access-522d6\") pod \"1e315352-8bba-4fea-8fad-e88f718ca0a8\" (UID: \"1e315352-8bba-4fea-8fad-e88f718ca0a8\") "
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.222550 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e315352-8bba-4fea-8fad-e88f718ca0a8-utilities" (OuterVolumeSpecName: "utilities") pod "1e315352-8bba-4fea-8fad-e88f718ca0a8" (UID: "1e315352-8bba-4fea-8fad-e88f718ca0a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.228304 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e315352-8bba-4fea-8fad-e88f718ca0a8-kube-api-access-522d6" (OuterVolumeSpecName: "kube-api-access-522d6") pod "1e315352-8bba-4fea-8fad-e88f718ca0a8" (UID: "1e315352-8bba-4fea-8fad-e88f718ca0a8"). InnerVolumeSpecName "kube-api-access-522d6".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.266860 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e315352-8bba-4fea-8fad-e88f718ca0a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e315352-8bba-4fea-8fad-e88f718ca0a8" (UID: "1e315352-8bba-4fea-8fad-e88f718ca0a8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.324188 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e315352-8bba-4fea-8fad-e88f718ca0a8-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.324217 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-522d6\" (UniqueName: \"kubernetes.io/projected/1e315352-8bba-4fea-8fad-e88f718ca0a8-kube-api-access-522d6\") on node \"crc\" DevicePath \"\""
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.324228 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e315352-8bba-4fea-8fad-e88f718ca0a8-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.353185 4636 generic.go:334] "Generic (PLEG): container finished" podID="1e315352-8bba-4fea-8fad-e88f718ca0a8" containerID="bbcd9378ddabaee5c5a719e01679024e61df2a4c3f4cf1914e08efefe2e7466b" exitCode=0
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.353227 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nsx29" event={"ID":"1e315352-8bba-4fea-8fad-e88f718ca0a8","Type":"ContainerDied","Data":"bbcd9378ddabaee5c5a719e01679024e61df2a4c3f4cf1914e08efefe2e7466b"}
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.353260 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nsx29" event={"ID":"1e315352-8bba-4fea-8fad-e88f718ca0a8","Type":"ContainerDied","Data":"162ba507b4303e9b71680555120906540d33ebbb58bd42e68000f39e47597c78"}
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.353279 4636 scope.go:117] "RemoveContainer" containerID="bbcd9378ddabaee5c5a719e01679024e61df2a4c3f4cf1914e08efefe2e7466b"
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.353505 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nsx29"
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.377906 4636 scope.go:117] "RemoveContainer" containerID="a282237b85427269e9db462c4cce103604b0d0f111d2d0cb3e30085d890bbae8"
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.403785 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nsx29"]
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.411936 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nsx29"]
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.415396 4636 scope.go:117] "RemoveContainer" containerID="76d37f376a8b98b1fbf62bf8865e995a2240ed9fd271bb1428c0f5e0548d4acc"
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.453142 4636 scope.go:117] "RemoveContainer" containerID="bbcd9378ddabaee5c5a719e01679024e61df2a4c3f4cf1914e08efefe2e7466b"
Oct 03 14:56:39 crc kubenswrapper[4636]: E1003 14:56:39.453732 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbcd9378ddabaee5c5a719e01679024e61df2a4c3f4cf1914e08efefe2e7466b\": container with ID starting with bbcd9378ddabaee5c5a719e01679024e61df2a4c3f4cf1914e08efefe2e7466b not found: ID does not exist" containerID="bbcd9378ddabaee5c5a719e01679024e61df2a4c3f4cf1914e08efefe2e7466b"
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.453810 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbcd9378ddabaee5c5a719e01679024e61df2a4c3f4cf1914e08efefe2e7466b"} err="failed to get container status \"bbcd9378ddabaee5c5a719e01679024e61df2a4c3f4cf1914e08efefe2e7466b\": rpc error: code = NotFound desc = could not find container \"bbcd9378ddabaee5c5a719e01679024e61df2a4c3f4cf1914e08efefe2e7466b\": container with ID starting with bbcd9378ddabaee5c5a719e01679024e61df2a4c3f4cf1914e08efefe2e7466b not found: ID does not exist"
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.453867 4636 scope.go:117] "RemoveContainer" containerID="a282237b85427269e9db462c4cce103604b0d0f111d2d0cb3e30085d890bbae8"
Oct 03 14:56:39 crc kubenswrapper[4636]: E1003 14:56:39.454342 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a282237b85427269e9db462c4cce103604b0d0f111d2d0cb3e30085d890bbae8\": container with ID starting with a282237b85427269e9db462c4cce103604b0d0f111d2d0cb3e30085d890bbae8 not found: ID does not exist" containerID="a282237b85427269e9db462c4cce103604b0d0f111d2d0cb3e30085d890bbae8"
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.454371 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a282237b85427269e9db462c4cce103604b0d0f111d2d0cb3e30085d890bbae8"} err="failed to get container status \"a282237b85427269e9db462c4cce103604b0d0f111d2d0cb3e30085d890bbae8\": rpc error: code = NotFound desc = could not find container \"a282237b85427269e9db462c4cce103604b0d0f111d2d0cb3e30085d890bbae8\": container with ID starting with a282237b85427269e9db462c4cce103604b0d0f111d2d0cb3e30085d890bbae8 not found: ID does not exist"
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.454388 4636 scope.go:117] "RemoveContainer" containerID="76d37f376a8b98b1fbf62bf8865e995a2240ed9fd271bb1428c0f5e0548d4acc"
Oct 03 14:56:39 crc kubenswrapper[4636]: E1003 14:56:39.454869 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76d37f376a8b98b1fbf62bf8865e995a2240ed9fd271bb1428c0f5e0548d4acc\": container with ID starting with 76d37f376a8b98b1fbf62bf8865e995a2240ed9fd271bb1428c0f5e0548d4acc not found: ID does not exist" containerID="76d37f376a8b98b1fbf62bf8865e995a2240ed9fd271bb1428c0f5e0548d4acc"
Oct 03 14:56:39 crc kubenswrapper[4636]: I1003 14:56:39.454911 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76d37f376a8b98b1fbf62bf8865e995a2240ed9fd271bb1428c0f5e0548d4acc"} err="failed to get container status \"76d37f376a8b98b1fbf62bf8865e995a2240ed9fd271bb1428c0f5e0548d4acc\": rpc error: code = NotFound desc = could not find container \"76d37f376a8b98b1fbf62bf8865e995a2240ed9fd271bb1428c0f5e0548d4acc\": container with ID starting with 76d37f376a8b98b1fbf62bf8865e995a2240ed9fd271bb1428c0f5e0548d4acc not found: ID does not exist"
Oct 03 14:56:40 crc kubenswrapper[4636]: I1003 14:56:40.807924 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e315352-8bba-4fea-8fad-e88f718ca0a8" path="/var/lib/kubelet/pods/1e315352-8bba-4fea-8fad-e88f718ca0a8/volumes"
Oct 03 14:57:09 crc kubenswrapper[4636]: I1003 14:57:09.162704 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:57:09 crc kubenswrapper[4636]: I1003 14:57:09.164180 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:57:39 crc kubenswrapper[4636]: I1003 14:57:39.162945 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:57:39 crc kubenswrapper[4636]: I1003 14:57:39.163560 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:58:09 crc kubenswrapper[4636]: I1003 14:58:09.162913 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 14:58:09 crc kubenswrapper[4636]: I1003 14:58:09.164244 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 14:58:09 crc kubenswrapper[4636]: I1003 14:58:09.164348 4636 kubelet.go:2542] "SyncLoop (probe)" probe="liveness"
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 14:58:09 crc kubenswrapper[4636]: I1003 14:58:09.165066 4636 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"67ba3388c355a3fdeaa7cdee1d86d23b98df73b28126852c58f76489bd32226f"} pod="openshift-machine-config-operator/machine-config-daemon-ngmch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 14:58:09 crc kubenswrapper[4636]: I1003 14:58:09.165206 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" containerID="cri-o://67ba3388c355a3fdeaa7cdee1d86d23b98df73b28126852c58f76489bd32226f" gracePeriod=600 Oct 03 14:58:10 crc kubenswrapper[4636]: I1003 14:58:10.159054 4636 generic.go:334] "Generic (PLEG): container finished" podID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerID="67ba3388c355a3fdeaa7cdee1d86d23b98df73b28126852c58f76489bd32226f" exitCode=0 Oct 03 14:58:10 crc kubenswrapper[4636]: I1003 14:58:10.159588 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerDied","Data":"67ba3388c355a3fdeaa7cdee1d86d23b98df73b28126852c58f76489bd32226f"} Oct 03 14:58:10 crc kubenswrapper[4636]: I1003 14:58:10.159617 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerStarted","Data":"45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb"} Oct 03 14:58:10 crc kubenswrapper[4636]: I1003 14:58:10.159633 4636 scope.go:117] "RemoveContainer" containerID="b22c92c0726fa0ddf436dd50c2f43747a4fd8e4bc9c75d69e8a0622faa77e6d9" Oct 03 14:59:42 crc kubenswrapper[4636]: I1003 14:59:42.412173 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mlcwr"] Oct 03 14:59:42 crc kubenswrapper[4636]: E1003 14:59:42.414387 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e315352-8bba-4fea-8fad-e88f718ca0a8" containerName="extract-utilities" Oct 03 14:59:42 crc kubenswrapper[4636]: I1003 14:59:42.414497 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e315352-8bba-4fea-8fad-e88f718ca0a8" containerName="extract-utilities" Oct 03 14:59:42 crc kubenswrapper[4636]: E1003 14:59:42.414589 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e315352-8bba-4fea-8fad-e88f718ca0a8" containerName="registry-server" Oct 03 14:59:42 crc kubenswrapper[4636]: I1003 14:59:42.414688 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e315352-8bba-4fea-8fad-e88f718ca0a8" containerName="registry-server" Oct 03 14:59:42 crc kubenswrapper[4636]: E1003 14:59:42.414756 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e315352-8bba-4fea-8fad-e88f718ca0a8" containerName="extract-content" Oct 03 14:59:42 crc kubenswrapper[4636]: I1003 14:59:42.414807 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e315352-8bba-4fea-8fad-e88f718ca0a8" containerName="extract-content" Oct 03 14:59:42 crc kubenswrapper[4636]: I1003 14:59:42.419287 4636 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1e315352-8bba-4fea-8fad-e88f718ca0a8" containerName="registry-server" Oct 03 14:59:42 crc kubenswrapper[4636]: I1003 14:59:42.422402 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mlcwr"] Oct 03 14:59:42 crc kubenswrapper[4636]: I1003 14:59:42.422522 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mlcwr" Oct 03 14:59:42 crc kubenswrapper[4636]: I1003 14:59:42.486785 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed463efa-bfe6-480e-b4bd-b35ad269095f-catalog-content\") pod \"community-operators-mlcwr\" (UID: \"ed463efa-bfe6-480e-b4bd-b35ad269095f\") " pod="openshift-marketplace/community-operators-mlcwr" Oct 03 14:59:42 crc kubenswrapper[4636]: I1003 14:59:42.486829 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv5hj\" (UniqueName: \"kubernetes.io/projected/ed463efa-bfe6-480e-b4bd-b35ad269095f-kube-api-access-tv5hj\") pod \"community-operators-mlcwr\" (UID: \"ed463efa-bfe6-480e-b4bd-b35ad269095f\") " pod="openshift-marketplace/community-operators-mlcwr" Oct 03 14:59:42 crc kubenswrapper[4636]: I1003 14:59:42.486883 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed463efa-bfe6-480e-b4bd-b35ad269095f-utilities\") pod \"community-operators-mlcwr\" (UID: \"ed463efa-bfe6-480e-b4bd-b35ad269095f\") " pod="openshift-marketplace/community-operators-mlcwr" Oct 03 14:59:42 crc kubenswrapper[4636]: I1003 14:59:42.588971 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv5hj\" (UniqueName: \"kubernetes.io/projected/ed463efa-bfe6-480e-b4bd-b35ad269095f-kube-api-access-tv5hj\") pod \"community-operators-mlcwr\" (UID: \"ed463efa-bfe6-480e-b4bd-b35ad269095f\") " pod="openshift-marketplace/community-operators-mlcwr" Oct 03 14:59:42 crc kubenswrapper[4636]: I1003 14:59:42.589087 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed463efa-bfe6-480e-b4bd-b35ad269095f-utilities\") pod \"community-operators-mlcwr\" (UID: \"ed463efa-bfe6-480e-b4bd-b35ad269095f\") " pod="openshift-marketplace/community-operators-mlcwr" Oct 03 14:59:42 crc kubenswrapper[4636]: I1003 14:59:42.589309 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed463efa-bfe6-480e-b4bd-b35ad269095f-catalog-content\") pod \"community-operators-mlcwr\" (UID: \"ed463efa-bfe6-480e-b4bd-b35ad269095f\") " pod="openshift-marketplace/community-operators-mlcwr" Oct 03 14:59:42 crc kubenswrapper[4636]: I1003 14:59:42.589813 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed463efa-bfe6-480e-b4bd-b35ad269095f-catalog-content\") pod \"community-operators-mlcwr\" (UID: \"ed463efa-bfe6-480e-b4bd-b35ad269095f\") " pod="openshift-marketplace/community-operators-mlcwr" Oct 03 14:59:42 crc kubenswrapper[4636]: I1003 14:59:42.590072 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed463efa-bfe6-480e-b4bd-b35ad269095f-utilities\") pod \"community-operators-mlcwr\" (UID: 
\"ed463efa-bfe6-480e-b4bd-b35ad269095f\") " pod="openshift-marketplace/community-operators-mlcwr" Oct 03 14:59:42 crc kubenswrapper[4636]: I1003 14:59:42.622238 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv5hj\" (UniqueName: \"kubernetes.io/projected/ed463efa-bfe6-480e-b4bd-b35ad269095f-kube-api-access-tv5hj\") pod \"community-operators-mlcwr\" (UID: \"ed463efa-bfe6-480e-b4bd-b35ad269095f\") " pod="openshift-marketplace/community-operators-mlcwr" Oct 03 14:59:42 crc kubenswrapper[4636]: I1003 14:59:42.751021 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mlcwr" Oct 03 14:59:43 crc kubenswrapper[4636]: I1003 14:59:43.339236 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mlcwr"] Oct 03 14:59:44 crc kubenswrapper[4636]: I1003 14:59:44.064942 4636 generic.go:334] "Generic (PLEG): container finished" podID="ed463efa-bfe6-480e-b4bd-b35ad269095f" containerID="d85ea82f0ab91997130a01cb8f5c293da9d149b7815fedea8833b039219eb272" exitCode=0 Oct 03 14:59:44 crc kubenswrapper[4636]: I1003 14:59:44.064998 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlcwr" event={"ID":"ed463efa-bfe6-480e-b4bd-b35ad269095f","Type":"ContainerDied","Data":"d85ea82f0ab91997130a01cb8f5c293da9d149b7815fedea8833b039219eb272"} Oct 03 14:59:44 crc kubenswrapper[4636]: I1003 14:59:44.065190 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlcwr" event={"ID":"ed463efa-bfe6-480e-b4bd-b35ad269095f","Type":"ContainerStarted","Data":"756abe8986f05a178ad3704693e390c2442988998f62f1ff4f225ad80aa3b708"} Oct 03 14:59:44 crc kubenswrapper[4636]: I1003 14:59:44.067418 4636 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 14:59:45 crc kubenswrapper[4636]: I1003 14:59:45.073293 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlcwr" event={"ID":"ed463efa-bfe6-480e-b4bd-b35ad269095f","Type":"ContainerStarted","Data":"4175888e44ff035c6cc73bdfe38300ee2a11bf83986b4ee0436d1f2f8bc44797"} Oct 03 14:59:47 crc kubenswrapper[4636]: I1003 14:59:47.091658 4636 generic.go:334] "Generic (PLEG): container finished" podID="ed463efa-bfe6-480e-b4bd-b35ad269095f" containerID="4175888e44ff035c6cc73bdfe38300ee2a11bf83986b4ee0436d1f2f8bc44797" exitCode=0 Oct 03 14:59:47 crc kubenswrapper[4636]: I1003 14:59:47.091743 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlcwr" event={"ID":"ed463efa-bfe6-480e-b4bd-b35ad269095f","Type":"ContainerDied","Data":"4175888e44ff035c6cc73bdfe38300ee2a11bf83986b4ee0436d1f2f8bc44797"} Oct 03 14:59:48 crc kubenswrapper[4636]: I1003 14:59:48.109180 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlcwr" event={"ID":"ed463efa-bfe6-480e-b4bd-b35ad269095f","Type":"ContainerStarted","Data":"b9b86e95a5cc152a1b31dd650f86a82ee2f3e56de36be023fe5418fd00bc7727"} Oct 03 14:59:48 crc kubenswrapper[4636]: I1003 14:59:48.133365 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mlcwr" podStartSLOduration=2.575321443 podStartE2EDuration="6.133346364s" podCreationTimestamp="2025-10-03 14:59:42 +0000 UTC" firstStartedPulling="2025-10-03 14:59:44.067161065 +0000 UTC m=+3533.925887312" 
lastFinishedPulling="2025-10-03 14:59:47.625185986 +0000 UTC m=+3537.483912233" observedRunningTime="2025-10-03 14:59:48.132158892 +0000 UTC m=+3537.990885139" watchObservedRunningTime="2025-10-03 14:59:48.133346364 +0000 UTC m=+3537.992072611" Oct 03 14:59:52 crc kubenswrapper[4636]: I1003 14:59:52.751808 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mlcwr" Oct 03 14:59:52 crc kubenswrapper[4636]: I1003 14:59:52.752389 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mlcwr" Oct 03 14:59:52 crc kubenswrapper[4636]: I1003 14:59:52.807780 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mlcwr" Oct 03 14:59:53 crc kubenswrapper[4636]: I1003 14:59:53.204927 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mlcwr" Oct 03 14:59:53 crc kubenswrapper[4636]: I1003 14:59:53.259043 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mlcwr"] Oct 03 14:59:55 crc kubenswrapper[4636]: I1003 14:59:55.181450 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mlcwr" podUID="ed463efa-bfe6-480e-b4bd-b35ad269095f" containerName="registry-server" containerID="cri-o://b9b86e95a5cc152a1b31dd650f86a82ee2f3e56de36be023fe5418fd00bc7727" gracePeriod=2 Oct 03 14:59:55 crc kubenswrapper[4636]: I1003 14:59:55.930117 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mlcwr" Oct 03 14:59:55 crc kubenswrapper[4636]: I1003 14:59:55.984358 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed463efa-bfe6-480e-b4bd-b35ad269095f-catalog-content\") pod \"ed463efa-bfe6-480e-b4bd-b35ad269095f\" (UID: \"ed463efa-bfe6-480e-b4bd-b35ad269095f\") " Oct 03 14:59:55 crc kubenswrapper[4636]: I1003 14:59:55.984541 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv5hj\" (UniqueName: \"kubernetes.io/projected/ed463efa-bfe6-480e-b4bd-b35ad269095f-kube-api-access-tv5hj\") pod \"ed463efa-bfe6-480e-b4bd-b35ad269095f\" (UID: \"ed463efa-bfe6-480e-b4bd-b35ad269095f\") " Oct 03 14:59:55 crc kubenswrapper[4636]: I1003 14:59:55.984560 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed463efa-bfe6-480e-b4bd-b35ad269095f-utilities\") pod \"ed463efa-bfe6-480e-b4bd-b35ad269095f\" (UID: \"ed463efa-bfe6-480e-b4bd-b35ad269095f\") " Oct 03 14:59:55 crc kubenswrapper[4636]: I1003 14:59:55.985943 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed463efa-bfe6-480e-b4bd-b35ad269095f-utilities" (OuterVolumeSpecName: "utilities") pod "ed463efa-bfe6-480e-b4bd-b35ad269095f" (UID: "ed463efa-bfe6-480e-b4bd-b35ad269095f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:59:55 crc kubenswrapper[4636]: I1003 14:59:55.994408 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed463efa-bfe6-480e-b4bd-b35ad269095f-kube-api-access-tv5hj" (OuterVolumeSpecName: "kube-api-access-tv5hj") pod "ed463efa-bfe6-480e-b4bd-b35ad269095f" (UID: "ed463efa-bfe6-480e-b4bd-b35ad269095f"). InnerVolumeSpecName "kube-api-access-tv5hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 14:59:56 crc kubenswrapper[4636]: I1003 14:59:56.045481 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed463efa-bfe6-480e-b4bd-b35ad269095f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed463efa-bfe6-480e-b4bd-b35ad269095f" (UID: "ed463efa-bfe6-480e-b4bd-b35ad269095f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 14:59:56 crc kubenswrapper[4636]: I1003 14:59:56.086778 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv5hj\" (UniqueName: \"kubernetes.io/projected/ed463efa-bfe6-480e-b4bd-b35ad269095f-kube-api-access-tv5hj\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:56 crc kubenswrapper[4636]: I1003 14:59:56.086818 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed463efa-bfe6-480e-b4bd-b35ad269095f-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:56 crc kubenswrapper[4636]: I1003 14:59:56.086829 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed463efa-bfe6-480e-b4bd-b35ad269095f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 14:59:56 crc kubenswrapper[4636]: I1003 14:59:56.192542 4636 generic.go:334] "Generic (PLEG): container finished" podID="ed463efa-bfe6-480e-b4bd-b35ad269095f" containerID="b9b86e95a5cc152a1b31dd650f86a82ee2f3e56de36be023fe5418fd00bc7727" exitCode=0 Oct 03 14:59:56 crc kubenswrapper[4636]: I1003 14:59:56.192602 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlcwr" event={"ID":"ed463efa-bfe6-480e-b4bd-b35ad269095f","Type":"ContainerDied","Data":"b9b86e95a5cc152a1b31dd650f86a82ee2f3e56de36be023fe5418fd00bc7727"} Oct 03 14:59:56 crc kubenswrapper[4636]: I1003 14:59:56.192622 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mlcwr" Oct 03 14:59:56 crc kubenswrapper[4636]: I1003 14:59:56.192643 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlcwr" event={"ID":"ed463efa-bfe6-480e-b4bd-b35ad269095f","Type":"ContainerDied","Data":"756abe8986f05a178ad3704693e390c2442988998f62f1ff4f225ad80aa3b708"} Oct 03 14:59:56 crc kubenswrapper[4636]: I1003 14:59:56.192665 4636 scope.go:117] "RemoveContainer" containerID="b9b86e95a5cc152a1b31dd650f86a82ee2f3e56de36be023fe5418fd00bc7727" Oct 03 14:59:56 crc kubenswrapper[4636]: I1003 14:59:56.217446 4636 scope.go:117] "RemoveContainer" containerID="4175888e44ff035c6cc73bdfe38300ee2a11bf83986b4ee0436d1f2f8bc44797" Oct 03 14:59:56 crc kubenswrapper[4636]: I1003 14:59:56.231535 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mlcwr"] Oct 03 14:59:56 crc kubenswrapper[4636]: I1003 14:59:56.240003 4636 scope.go:117] "RemoveContainer" containerID="d85ea82f0ab91997130a01cb8f5c293da9d149b7815fedea8833b039219eb272" Oct 03 14:59:56 crc kubenswrapper[4636]: I1003 14:59:56.260644 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mlcwr"] Oct 03 14:59:56 crc kubenswrapper[4636]: I1003 14:59:56.296573 4636 scope.go:117] "RemoveContainer" containerID="b9b86e95a5cc152a1b31dd650f86a82ee2f3e56de36be023fe5418fd00bc7727" Oct 03 14:59:56 crc kubenswrapper[4636]: E1003 14:59:56.296895 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9b86e95a5cc152a1b31dd650f86a82ee2f3e56de36be023fe5418fd00bc7727\": container with ID starting with b9b86e95a5cc152a1b31dd650f86a82ee2f3e56de36be023fe5418fd00bc7727 not found: ID does not exist" containerID="b9b86e95a5cc152a1b31dd650f86a82ee2f3e56de36be023fe5418fd00bc7727" Oct 03 14:59:56 crc kubenswrapper[4636]: I1003 14:59:56.296923 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9b86e95a5cc152a1b31dd650f86a82ee2f3e56de36be023fe5418fd00bc7727"} err="failed to get container status \"b9b86e95a5cc152a1b31dd650f86a82ee2f3e56de36be023fe5418fd00bc7727\": rpc error: code = NotFound desc = could not find container \"b9b86e95a5cc152a1b31dd650f86a82ee2f3e56de36be023fe5418fd00bc7727\": container with ID starting with b9b86e95a5cc152a1b31dd650f86a82ee2f3e56de36be023fe5418fd00bc7727 not found: ID does not exist" Oct 03 14:59:56 crc kubenswrapper[4636]: I1003 14:59:56.296972 4636 scope.go:117] "RemoveContainer" containerID="4175888e44ff035c6cc73bdfe38300ee2a11bf83986b4ee0436d1f2f8bc44797" Oct 03 14:59:56 crc kubenswrapper[4636]: E1003 14:59:56.297362 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4175888e44ff035c6cc73bdfe38300ee2a11bf83986b4ee0436d1f2f8bc44797\": container with ID starting with 4175888e44ff035c6cc73bdfe38300ee2a11bf83986b4ee0436d1f2f8bc44797 not found: ID does not exist" containerID="4175888e44ff035c6cc73bdfe38300ee2a11bf83986b4ee0436d1f2f8bc44797" Oct 03 14:59:56 crc kubenswrapper[4636]: I1003 14:59:56.297390 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4175888e44ff035c6cc73bdfe38300ee2a11bf83986b4ee0436d1f2f8bc44797"} err="failed to get container status \"4175888e44ff035c6cc73bdfe38300ee2a11bf83986b4ee0436d1f2f8bc44797\": rpc error: code = NotFound desc = could not find 
container \"4175888e44ff035c6cc73bdfe38300ee2a11bf83986b4ee0436d1f2f8bc44797\": container with ID starting with 4175888e44ff035c6cc73bdfe38300ee2a11bf83986b4ee0436d1f2f8bc44797 not found: ID does not exist" Oct 03 14:59:56 crc kubenswrapper[4636]: I1003 14:59:56.297409 4636 scope.go:117] "RemoveContainer" containerID="d85ea82f0ab91997130a01cb8f5c293da9d149b7815fedea8833b039219eb272" Oct 03 14:59:56 crc kubenswrapper[4636]: E1003 14:59:56.297664 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d85ea82f0ab91997130a01cb8f5c293da9d149b7815fedea8833b039219eb272\": container with ID starting with d85ea82f0ab91997130a01cb8f5c293da9d149b7815fedea8833b039219eb272 not found: ID does not exist" containerID="d85ea82f0ab91997130a01cb8f5c293da9d149b7815fedea8833b039219eb272" Oct 03 14:59:56 crc kubenswrapper[4636]: I1003 14:59:56.297685 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d85ea82f0ab91997130a01cb8f5c293da9d149b7815fedea8833b039219eb272"} err="failed to get container status \"d85ea82f0ab91997130a01cb8f5c293da9d149b7815fedea8833b039219eb272\": rpc error: code = NotFound desc = could not find container \"d85ea82f0ab91997130a01cb8f5c293da9d149b7815fedea8833b039219eb272\": container with ID starting with d85ea82f0ab91997130a01cb8f5c293da9d149b7815fedea8833b039219eb272 not found: ID does not exist" Oct 03 14:59:56 crc kubenswrapper[4636]: I1003 14:59:56.803437 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed463efa-bfe6-480e-b4bd-b35ad269095f" path="/var/lib/kubelet/pods/ed463efa-bfe6-480e-b4bd-b35ad269095f/volumes" Oct 03 15:00:00 crc kubenswrapper[4636]: I1003 15:00:00.175362 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325060-gt665"] Oct 03 15:00:00 crc kubenswrapper[4636]: E1003 15:00:00.176315 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed463efa-bfe6-480e-b4bd-b35ad269095f" containerName="registry-server" Oct 03 15:00:00 crc kubenswrapper[4636]: I1003 15:00:00.176333 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed463efa-bfe6-480e-b4bd-b35ad269095f" containerName="registry-server" Oct 03 15:00:00 crc kubenswrapper[4636]: E1003 15:00:00.176352 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed463efa-bfe6-480e-b4bd-b35ad269095f" containerName="extract-content" Oct 03 15:00:00 crc kubenswrapper[4636]: I1003 15:00:00.176376 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed463efa-bfe6-480e-b4bd-b35ad269095f" containerName="extract-content" Oct 03 15:00:00 crc kubenswrapper[4636]: E1003 15:00:00.176391 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed463efa-bfe6-480e-b4bd-b35ad269095f" containerName="extract-utilities" Oct 03 15:00:00 crc kubenswrapper[4636]: I1003 15:00:00.176399 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed463efa-bfe6-480e-b4bd-b35ad269095f" containerName="extract-utilities" Oct 03 15:00:00 crc kubenswrapper[4636]: I1003 15:00:00.176639 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed463efa-bfe6-480e-b4bd-b35ad269095f" containerName="registry-server" Oct 03 15:00:00 crc kubenswrapper[4636]: I1003 15:00:00.177367 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-gt665" Oct 03 15:00:00 crc kubenswrapper[4636]: I1003 15:00:00.180360 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 15:00:00 crc kubenswrapper[4636]: I1003 15:00:00.180751 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 15:00:00 crc kubenswrapper[4636]: I1003 15:00:00.189800 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325060-gt665"] Oct 03 15:00:00 crc kubenswrapper[4636]: I1003 15:00:00.273413 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0f939e8-6b57-4d66-8b3d-c5a8b9df2191-config-volume\") pod \"collect-profiles-29325060-gt665\" (UID: \"e0f939e8-6b57-4d66-8b3d-c5a8b9df2191\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-gt665" Oct 03 15:00:00 crc kubenswrapper[4636]: I1003 15:00:00.273887 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e0f939e8-6b57-4d66-8b3d-c5a8b9df2191-secret-volume\") pod \"collect-profiles-29325060-gt665\" (UID: \"e0f939e8-6b57-4d66-8b3d-c5a8b9df2191\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-gt665" Oct 03 15:00:00 crc kubenswrapper[4636]: I1003 15:00:00.273946 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t9qh\" (UniqueName: \"kubernetes.io/projected/e0f939e8-6b57-4d66-8b3d-c5a8b9df2191-kube-api-access-5t9qh\") pod \"collect-profiles-29325060-gt665\" (UID: \"e0f939e8-6b57-4d66-8b3d-c5a8b9df2191\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-gt665" Oct 03 15:00:00 crc kubenswrapper[4636]: I1003 15:00:00.375709 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t9qh\" (UniqueName: \"kubernetes.io/projected/e0f939e8-6b57-4d66-8b3d-c5a8b9df2191-kube-api-access-5t9qh\") pod \"collect-profiles-29325060-gt665\" (UID: \"e0f939e8-6b57-4d66-8b3d-c5a8b9df2191\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-gt665" Oct 03 15:00:00 crc kubenswrapper[4636]: I1003 15:00:00.375884 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0f939e8-6b57-4d66-8b3d-c5a8b9df2191-config-volume\") pod \"collect-profiles-29325060-gt665\" (UID: \"e0f939e8-6b57-4d66-8b3d-c5a8b9df2191\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-gt665" Oct 03 15:00:00 crc kubenswrapper[4636]: I1003 15:00:00.375920 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e0f939e8-6b57-4d66-8b3d-c5a8b9df2191-secret-volume\") pod \"collect-profiles-29325060-gt665\" (UID: \"e0f939e8-6b57-4d66-8b3d-c5a8b9df2191\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-gt665" Oct 03 15:00:00 crc kubenswrapper[4636]: I1003 15:00:00.376838 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0f939e8-6b57-4d66-8b3d-c5a8b9df2191-config-volume\") pod 
\"collect-profiles-29325060-gt665\" (UID: \"e0f939e8-6b57-4d66-8b3d-c5a8b9df2191\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-gt665" Oct 03 15:00:00 crc kubenswrapper[4636]: I1003 15:00:00.394883 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e0f939e8-6b57-4d66-8b3d-c5a8b9df2191-secret-volume\") pod \"collect-profiles-29325060-gt665\" (UID: \"e0f939e8-6b57-4d66-8b3d-c5a8b9df2191\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-gt665" Oct 03 15:00:00 crc kubenswrapper[4636]: I1003 15:00:00.427051 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t9qh\" (UniqueName: \"kubernetes.io/projected/e0f939e8-6b57-4d66-8b3d-c5a8b9df2191-kube-api-access-5t9qh\") pod \"collect-profiles-29325060-gt665\" (UID: \"e0f939e8-6b57-4d66-8b3d-c5a8b9df2191\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-gt665" Oct 03 15:00:00 crc kubenswrapper[4636]: I1003 15:00:00.508915 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-gt665" Oct 03 15:00:01 crc kubenswrapper[4636]: I1003 15:00:01.050909 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325060-gt665"] Oct 03 15:00:01 crc kubenswrapper[4636]: I1003 15:00:01.241980 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-gt665" event={"ID":"e0f939e8-6b57-4d66-8b3d-c5a8b9df2191","Type":"ContainerStarted","Data":"05ffd6ecb2d90162a59dad55f54cbcae0e2c729a0556a2599c611cfd5e0f539f"} Oct 03 15:00:01 crc kubenswrapper[4636]: I1003 15:00:01.242342 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-gt665" event={"ID":"e0f939e8-6b57-4d66-8b3d-c5a8b9df2191","Type":"ContainerStarted","Data":"10a96646fe9f2ba5a059ea54a005eeafdf06327d6ad3cfc57081a7e37d5734f5"} Oct 03 15:00:01 crc kubenswrapper[4636]: I1003 15:00:01.262087 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-gt665" podStartSLOduration=1.262065947 podStartE2EDuration="1.262065947s" podCreationTimestamp="2025-10-03 15:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:00:01.257484717 +0000 UTC m=+3551.116210964" watchObservedRunningTime="2025-10-03 15:00:01.262065947 +0000 UTC m=+3551.120792194" Oct 03 15:00:02 crc kubenswrapper[4636]: I1003 15:00:02.254665 4636 generic.go:334] "Generic (PLEG): container finished" podID="e0f939e8-6b57-4d66-8b3d-c5a8b9df2191" containerID="05ffd6ecb2d90162a59dad55f54cbcae0e2c729a0556a2599c611cfd5e0f539f" exitCode=0 Oct 03 15:00:02 crc kubenswrapper[4636]: I1003 15:00:02.254782 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-gt665" event={"ID":"e0f939e8-6b57-4d66-8b3d-c5a8b9df2191","Type":"ContainerDied","Data":"05ffd6ecb2d90162a59dad55f54cbcae0e2c729a0556a2599c611cfd5e0f539f"} Oct 03 15:00:03 crc kubenswrapper[4636]: I1003 15:00:03.833389 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-gt665" Oct 03 15:00:03 crc kubenswrapper[4636]: I1003 15:00:03.983707 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t9qh\" (UniqueName: \"kubernetes.io/projected/e0f939e8-6b57-4d66-8b3d-c5a8b9df2191-kube-api-access-5t9qh\") pod \"e0f939e8-6b57-4d66-8b3d-c5a8b9df2191\" (UID: \"e0f939e8-6b57-4d66-8b3d-c5a8b9df2191\") " Oct 03 15:00:03 crc kubenswrapper[4636]: I1003 15:00:03.983895 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0f939e8-6b57-4d66-8b3d-c5a8b9df2191-config-volume\") pod \"e0f939e8-6b57-4d66-8b3d-c5a8b9df2191\" (UID: \"e0f939e8-6b57-4d66-8b3d-c5a8b9df2191\") " Oct 03 15:00:03 crc kubenswrapper[4636]: I1003 15:00:03.983939 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e0f939e8-6b57-4d66-8b3d-c5a8b9df2191-secret-volume\") pod \"e0f939e8-6b57-4d66-8b3d-c5a8b9df2191\" (UID: \"e0f939e8-6b57-4d66-8b3d-c5a8b9df2191\") " Oct 03 15:00:03 crc kubenswrapper[4636]: I1003 15:00:03.984904 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0f939e8-6b57-4d66-8b3d-c5a8b9df2191-config-volume" (OuterVolumeSpecName: "config-volume") pod "e0f939e8-6b57-4d66-8b3d-c5a8b9df2191" (UID: "e0f939e8-6b57-4d66-8b3d-c5a8b9df2191"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:00:03 crc kubenswrapper[4636]: I1003 15:00:03.995565 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0f939e8-6b57-4d66-8b3d-c5a8b9df2191-kube-api-access-5t9qh" (OuterVolumeSpecName: "kube-api-access-5t9qh") pod "e0f939e8-6b57-4d66-8b3d-c5a8b9df2191" (UID: "e0f939e8-6b57-4d66-8b3d-c5a8b9df2191"). InnerVolumeSpecName "kube-api-access-5t9qh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:00:04 crc kubenswrapper[4636]: I1003 15:00:04.003887 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f939e8-6b57-4d66-8b3d-c5a8b9df2191-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e0f939e8-6b57-4d66-8b3d-c5a8b9df2191" (UID: "e0f939e8-6b57-4d66-8b3d-c5a8b9df2191"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:00:04 crc kubenswrapper[4636]: I1003 15:00:04.085967 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t9qh\" (UniqueName: \"kubernetes.io/projected/e0f939e8-6b57-4d66-8b3d-c5a8b9df2191-kube-api-access-5t9qh\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:04 crc kubenswrapper[4636]: I1003 15:00:04.086011 4636 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0f939e8-6b57-4d66-8b3d-c5a8b9df2191-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:04 crc kubenswrapper[4636]: I1003 15:00:04.086020 4636 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e0f939e8-6b57-4d66-8b3d-c5a8b9df2191-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 15:00:04 crc kubenswrapper[4636]: I1003 15:00:04.271151 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-gt665" Oct 03 15:00:04 crc kubenswrapper[4636]: I1003 15:00:04.271114 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325060-gt665" event={"ID":"e0f939e8-6b57-4d66-8b3d-c5a8b9df2191","Type":"ContainerDied","Data":"10a96646fe9f2ba5a059ea54a005eeafdf06327d6ad3cfc57081a7e37d5734f5"} Oct 03 15:00:04 crc kubenswrapper[4636]: I1003 15:00:04.271264 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10a96646fe9f2ba5a059ea54a005eeafdf06327d6ad3cfc57081a7e37d5734f5" Oct 03 15:00:04 crc kubenswrapper[4636]: I1003 15:00:04.349406 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325015-wk268"] Oct 03 15:00:04 crc kubenswrapper[4636]: I1003 15:00:04.357721 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325015-wk268"] Oct 03 15:00:04 crc kubenswrapper[4636]: I1003 15:00:04.806696 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc638615-ad90-437e-ad21-6b25821b92f1" path="/var/lib/kubelet/pods/cc638615-ad90-437e-ad21-6b25821b92f1/volumes" Oct 03 15:00:09 crc kubenswrapper[4636]: I1003 15:00:09.163479 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:00:09 crc kubenswrapper[4636]: I1003 15:00:09.165526 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:00:10 crc kubenswrapper[4636]: I1003 15:00:10.341636 4636 scope.go:117] "RemoveContainer" containerID="01ad34bfabeabf5463339da228b48b91fe77ca5f1d4dcaa087aa15baf05d59b5" Oct 03 15:00:39 crc kubenswrapper[4636]: I1003 15:00:39.162872 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:00:39 crc kubenswrapper[4636]: I1003 15:00:39.163397 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:01:00 crc kubenswrapper[4636]: I1003 15:01:00.147167 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29325061-4mrxv"] Oct 03 15:01:00 crc kubenswrapper[4636]: E1003 15:01:00.149022 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0f939e8-6b57-4d66-8b3d-c5a8b9df2191" containerName="collect-profiles" Oct 03 15:01:00 crc kubenswrapper[4636]: I1003 15:01:00.149108 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f939e8-6b57-4d66-8b3d-c5a8b9df2191" containerName="collect-profiles" Oct 03 15:01:00 crc 
Oct 03 15:00:09 crc kubenswrapper[4636]: I1003 15:00:09.163479 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 15:00:09 crc kubenswrapper[4636]: I1003 15:00:09.165526 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 15:00:10 crc kubenswrapper[4636]: I1003 15:00:10.341636 4636 scope.go:117] "RemoveContainer" containerID="01ad34bfabeabf5463339da228b48b91fe77ca5f1d4dcaa087aa15baf05d59b5"
Oct 03 15:00:39 crc kubenswrapper[4636]: I1003 15:00:39.162872 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 15:00:39 crc kubenswrapper[4636]: I1003 15:00:39.163397 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 15:01:00 crc kubenswrapper[4636]: I1003 15:01:00.147167 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29325061-4mrxv"]
Oct 03 15:01:00 crc kubenswrapper[4636]: E1003 15:01:00.149022 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0f939e8-6b57-4d66-8b3d-c5a8b9df2191" containerName="collect-profiles"
Oct 03 15:01:00 crc kubenswrapper[4636]: I1003 15:01:00.149108 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f939e8-6b57-4d66-8b3d-c5a8b9df2191" containerName="collect-profiles"
Oct 03 15:01:00 crc kubenswrapper[4636]: I1003 15:01:00.149410 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0f939e8-6b57-4d66-8b3d-c5a8b9df2191" containerName="collect-profiles"
Oct 03 15:01:00 crc kubenswrapper[4636]: I1003 15:01:00.150130 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29325061-4mrxv"
Oct 03 15:01:00 crc kubenswrapper[4636]: I1003 15:01:00.154457 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29325061-4mrxv"]
Oct 03 15:01:00 crc kubenswrapper[4636]: I1003 15:01:00.280686 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spzvc\" (UniqueName: \"kubernetes.io/projected/e2f61a03-c4e7-414d-b6f9-b1f920d35757-kube-api-access-spzvc\") pod \"keystone-cron-29325061-4mrxv\" (UID: \"e2f61a03-c4e7-414d-b6f9-b1f920d35757\") " pod="openstack/keystone-cron-29325061-4mrxv"
Oct 03 15:01:00 crc kubenswrapper[4636]: I1003 15:01:00.280750 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f61a03-c4e7-414d-b6f9-b1f920d35757-config-data\") pod \"keystone-cron-29325061-4mrxv\" (UID: \"e2f61a03-c4e7-414d-b6f9-b1f920d35757\") " pod="openstack/keystone-cron-29325061-4mrxv"
Oct 03 15:01:00 crc kubenswrapper[4636]: I1003 15:01:00.280808 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2f61a03-c4e7-414d-b6f9-b1f920d35757-fernet-keys\") pod \"keystone-cron-29325061-4mrxv\" (UID: \"e2f61a03-c4e7-414d-b6f9-b1f920d35757\") " pod="openstack/keystone-cron-29325061-4mrxv"
Oct 03 15:01:00 crc kubenswrapper[4636]: I1003 15:01:00.280938 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f61a03-c4e7-414d-b6f9-b1f920d35757-combined-ca-bundle\") pod \"keystone-cron-29325061-4mrxv\" (UID: \"e2f61a03-c4e7-414d-b6f9-b1f920d35757\") " pod="openstack/keystone-cron-29325061-4mrxv"
Oct 03 15:01:00 crc kubenswrapper[4636]: I1003 15:01:00.382842 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spzvc\" (UniqueName: \"kubernetes.io/projected/e2f61a03-c4e7-414d-b6f9-b1f920d35757-kube-api-access-spzvc\") pod \"keystone-cron-29325061-4mrxv\" (UID: \"e2f61a03-c4e7-414d-b6f9-b1f920d35757\") " pod="openstack/keystone-cron-29325061-4mrxv"
Oct 03 15:01:00 crc kubenswrapper[4636]: I1003 15:01:00.382951 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f61a03-c4e7-414d-b6f9-b1f920d35757-config-data\") pod \"keystone-cron-29325061-4mrxv\" (UID: \"e2f61a03-c4e7-414d-b6f9-b1f920d35757\") " pod="openstack/keystone-cron-29325061-4mrxv"
Oct 03 15:01:00 crc kubenswrapper[4636]: I1003 15:01:00.382991 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2f61a03-c4e7-414d-b6f9-b1f920d35757-fernet-keys\") pod \"keystone-cron-29325061-4mrxv\" (UID: \"e2f61a03-c4e7-414d-b6f9-b1f920d35757\") " pod="openstack/keystone-cron-29325061-4mrxv"
Oct 03 15:01:00 crc kubenswrapper[4636]: I1003 15:01:00.383949 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f61a03-c4e7-414d-b6f9-b1f920d35757-combined-ca-bundle\") pod \"keystone-cron-29325061-4mrxv\" (UID: \"e2f61a03-c4e7-414d-b6f9-b1f920d35757\") " pod="openstack/keystone-cron-29325061-4mrxv"
Oct 03 15:01:00 crc kubenswrapper[4636]: I1003 15:01:00.389447 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2f61a03-c4e7-414d-b6f9-b1f920d35757-fernet-keys\") pod \"keystone-cron-29325061-4mrxv\" (UID: \"e2f61a03-c4e7-414d-b6f9-b1f920d35757\") " pod="openstack/keystone-cron-29325061-4mrxv"
Oct 03 15:01:00 crc kubenswrapper[4636]: I1003 15:01:00.389995 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f61a03-c4e7-414d-b6f9-b1f920d35757-config-data\") pod \"keystone-cron-29325061-4mrxv\" (UID: \"e2f61a03-c4e7-414d-b6f9-b1f920d35757\") " pod="openstack/keystone-cron-29325061-4mrxv"
Oct 03 15:01:00 crc kubenswrapper[4636]: I1003 15:01:00.397243 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f61a03-c4e7-414d-b6f9-b1f920d35757-combined-ca-bundle\") pod \"keystone-cron-29325061-4mrxv\" (UID: \"e2f61a03-c4e7-414d-b6f9-b1f920d35757\") " pod="openstack/keystone-cron-29325061-4mrxv"
Oct 03 15:01:00 crc kubenswrapper[4636]: I1003 15:01:00.410182 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spzvc\" (UniqueName: \"kubernetes.io/projected/e2f61a03-c4e7-414d-b6f9-b1f920d35757-kube-api-access-spzvc\") pod \"keystone-cron-29325061-4mrxv\" (UID: \"e2f61a03-c4e7-414d-b6f9-b1f920d35757\") " pod="openstack/keystone-cron-29325061-4mrxv"
Oct 03 15:01:00 crc kubenswrapper[4636]: I1003 15:01:00.476566 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29325061-4mrxv"
Oct 03 15:01:00 crc kubenswrapper[4636]: I1003 15:01:00.946840 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29325061-4mrxv"]
Oct 03 15:01:01 crc kubenswrapper[4636]: I1003 15:01:01.778279 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29325061-4mrxv" event={"ID":"e2f61a03-c4e7-414d-b6f9-b1f920d35757","Type":"ContainerStarted","Data":"742499a26e70661f76cefe8f40e7629028bb4dcb00be449ba01b53f49e45dda2"}
Oct 03 15:01:01 crc kubenswrapper[4636]: I1003 15:01:01.778762 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29325061-4mrxv" event={"ID":"e2f61a03-c4e7-414d-b6f9-b1f920d35757","Type":"ContainerStarted","Data":"0a0f72ac5ce7949008f0faa3cdd43eaa74129839dbb51ef29023a165520691fa"}
Oct 03 15:01:04 crc kubenswrapper[4636]: I1003 15:01:04.812612 4636 generic.go:334] "Generic (PLEG): container finished" podID="e2f61a03-c4e7-414d-b6f9-b1f920d35757" containerID="742499a26e70661f76cefe8f40e7629028bb4dcb00be449ba01b53f49e45dda2" exitCode=0
Oct 03 15:01:04 crc kubenswrapper[4636]: I1003 15:01:04.812681 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29325061-4mrxv" event={"ID":"e2f61a03-c4e7-414d-b6f9-b1f920d35757","Type":"ContainerDied","Data":"742499a26e70661f76cefe8f40e7629028bb4dcb00be449ba01b53f49e45dda2"}
Oct 03 15:01:06 crc kubenswrapper[4636]: I1003 15:01:06.368646 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29325061-4mrxv"
Oct 03 15:01:06 crc kubenswrapper[4636]: I1003 15:01:06.448283 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f61a03-c4e7-414d-b6f9-b1f920d35757-config-data\") pod \"e2f61a03-c4e7-414d-b6f9-b1f920d35757\" (UID: \"e2f61a03-c4e7-414d-b6f9-b1f920d35757\") "
Oct 03 15:01:06 crc kubenswrapper[4636]: I1003 15:01:06.448324 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2f61a03-c4e7-414d-b6f9-b1f920d35757-fernet-keys\") pod \"e2f61a03-c4e7-414d-b6f9-b1f920d35757\" (UID: \"e2f61a03-c4e7-414d-b6f9-b1f920d35757\") "
Oct 03 15:01:06 crc kubenswrapper[4636]: I1003 15:01:06.448356 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spzvc\" (UniqueName: \"kubernetes.io/projected/e2f61a03-c4e7-414d-b6f9-b1f920d35757-kube-api-access-spzvc\") pod \"e2f61a03-c4e7-414d-b6f9-b1f920d35757\" (UID: \"e2f61a03-c4e7-414d-b6f9-b1f920d35757\") "
Oct 03 15:01:06 crc kubenswrapper[4636]: I1003 15:01:06.448467 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f61a03-c4e7-414d-b6f9-b1f920d35757-combined-ca-bundle\") pod \"e2f61a03-c4e7-414d-b6f9-b1f920d35757\" (UID: \"e2f61a03-c4e7-414d-b6f9-b1f920d35757\") "
Oct 03 15:01:06 crc kubenswrapper[4636]: I1003 15:01:06.471805 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2f61a03-c4e7-414d-b6f9-b1f920d35757-kube-api-access-spzvc" (OuterVolumeSpecName: "kube-api-access-spzvc") pod "e2f61a03-c4e7-414d-b6f9-b1f920d35757" (UID: "e2f61a03-c4e7-414d-b6f9-b1f920d35757"). InnerVolumeSpecName "kube-api-access-spzvc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:01:06 crc kubenswrapper[4636]: I1003 15:01:06.471909 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f61a03-c4e7-414d-b6f9-b1f920d35757-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e2f61a03-c4e7-414d-b6f9-b1f920d35757" (UID: "e2f61a03-c4e7-414d-b6f9-b1f920d35757"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:01:06 crc kubenswrapper[4636]: I1003 15:01:06.513192 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f61a03-c4e7-414d-b6f9-b1f920d35757-config-data" (OuterVolumeSpecName: "config-data") pod "e2f61a03-c4e7-414d-b6f9-b1f920d35757" (UID: "e2f61a03-c4e7-414d-b6f9-b1f920d35757"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:01:06 crc kubenswrapper[4636]: I1003 15:01:06.515456 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f61a03-c4e7-414d-b6f9-b1f920d35757-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2f61a03-c4e7-414d-b6f9-b1f920d35757" (UID: "e2f61a03-c4e7-414d-b6f9-b1f920d35757"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:01:06 crc kubenswrapper[4636]: I1003 15:01:06.549855 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f61a03-c4e7-414d-b6f9-b1f920d35757-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 15:01:06 crc kubenswrapper[4636]: I1003 15:01:06.549889 4636 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2f61a03-c4e7-414d-b6f9-b1f920d35757-fernet-keys\") on node \"crc\" DevicePath \"\""
Oct 03 15:01:06 crc kubenswrapper[4636]: I1003 15:01:06.549901 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spzvc\" (UniqueName: \"kubernetes.io/projected/e2f61a03-c4e7-414d-b6f9-b1f920d35757-kube-api-access-spzvc\") on node \"crc\" DevicePath \"\""
Oct 03 15:01:06 crc kubenswrapper[4636]: I1003 15:01:06.549911 4636 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f61a03-c4e7-414d-b6f9-b1f920d35757-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 15:01:06 crc kubenswrapper[4636]: I1003 15:01:06.830377 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29325061-4mrxv" event={"ID":"e2f61a03-c4e7-414d-b6f9-b1f920d35757","Type":"ContainerDied","Data":"0a0f72ac5ce7949008f0faa3cdd43eaa74129839dbb51ef29023a165520691fa"}
Oct 03 15:01:06 crc kubenswrapper[4636]: I1003 15:01:06.830417 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a0f72ac5ce7949008f0faa3cdd43eaa74129839dbb51ef29023a165520691fa"
Oct 03 15:01:06 crc kubenswrapper[4636]: I1003 15:01:06.830686 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29325061-4mrxv"
containerID="cri-o://45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" gracePeriod=600 Oct 03 15:01:09 crc kubenswrapper[4636]: E1003 15:01:09.306613 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:01:09 crc kubenswrapper[4636]: I1003 15:01:09.859414 4636 generic.go:334] "Generic (PLEG): container finished" podID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" exitCode=0 Oct 03 15:01:09 crc kubenswrapper[4636]: I1003 15:01:09.859465 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerDied","Data":"45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb"} Oct 03 15:01:09 crc kubenswrapper[4636]: I1003 15:01:09.859504 4636 scope.go:117] "RemoveContainer" containerID="67ba3388c355a3fdeaa7cdee1d86d23b98df73b28126852c58f76489bd32226f" Oct 03 15:01:09 crc kubenswrapper[4636]: I1003 15:01:09.860188 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:01:09 crc kubenswrapper[4636]: E1003 15:01:09.860558 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:01:10 crc kubenswrapper[4636]: I1003 15:01:10.441774 4636 scope.go:117] "RemoveContainer" containerID="e6b4d3f01c2c728d8bdf390a4c77b51c6231b4a01e2e8f9f0f39a7d0ce8e81e1" Oct 03 15:01:10 crc kubenswrapper[4636]: I1003 15:01:10.475426 4636 scope.go:117] "RemoveContainer" containerID="457c3880561c363482cea1820224754ecc6a80f2ab9d720dac0939fb3d9cb2fb" Oct 03 15:01:10 crc kubenswrapper[4636]: I1003 15:01:10.502058 4636 scope.go:117] "RemoveContainer" containerID="653cc7f06bf945ce652bfe0251df158bc5677613f7e4671d171a5604512f5565" Oct 03 15:01:21 crc kubenswrapper[4636]: I1003 15:01:21.794757 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:01:21 crc kubenswrapper[4636]: E1003 15:01:21.795630 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:01:33 crc kubenswrapper[4636]: I1003 15:01:33.793774 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:01:33 crc kubenswrapper[4636]: E1003 15:01:33.795435 4636 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:01:44 crc kubenswrapper[4636]: I1003 15:01:44.794369 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:01:44 crc kubenswrapper[4636]: E1003 15:01:44.795092 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:01:59 crc kubenswrapper[4636]: I1003 15:01:59.794022 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:01:59 crc kubenswrapper[4636]: E1003 15:01:59.795613 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:02:12 crc kubenswrapper[4636]: I1003 15:02:12.794243 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:02:12 crc kubenswrapper[4636]: E1003 15:02:12.796272 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:02:25 crc kubenswrapper[4636]: I1003 15:02:25.794183 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:02:25 crc kubenswrapper[4636]: E1003 15:02:25.794965 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:02:40 crc kubenswrapper[4636]: I1003 15:02:40.800402 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:02:40 crc kubenswrapper[4636]: E1003 15:02:40.801212 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:02:54 crc kubenswrapper[4636]: I1003 15:02:54.795086 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:02:54 crc kubenswrapper[4636]: E1003 15:02:54.796817 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:03:08 crc kubenswrapper[4636]: I1003 15:03:08.794643 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:03:08 crc kubenswrapper[4636]: E1003 15:03:08.796278 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:03:21 crc kubenswrapper[4636]: I1003 15:03:21.795924 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:03:21 crc kubenswrapper[4636]: E1003 15:03:21.797064 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:03:36 crc kubenswrapper[4636]: I1003 15:03:36.794671 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:03:36 crc kubenswrapper[4636]: E1003 15:03:36.795445 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:03:47 crc kubenswrapper[4636]: I1003 15:03:47.794344 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:03:47 crc kubenswrapper[4636]: E1003 15:03:47.795184 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" 
podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:03:58 crc kubenswrapper[4636]: I1003 15:03:58.793733 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:03:58 crc kubenswrapper[4636]: E1003 15:03:58.794515 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:04:13 crc kubenswrapper[4636]: I1003 15:04:13.794286 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:04:13 crc kubenswrapper[4636]: E1003 15:04:13.794955 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:04:27 crc kubenswrapper[4636]: I1003 15:04:27.793775 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:04:27 crc kubenswrapper[4636]: E1003 15:04:27.794614 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:04:34 crc kubenswrapper[4636]: I1003 15:04:34.413697 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j82hl"] Oct 03 15:04:34 crc kubenswrapper[4636]: E1003 15:04:34.414708 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f61a03-c4e7-414d-b6f9-b1f920d35757" containerName="keystone-cron" Oct 03 15:04:34 crc kubenswrapper[4636]: I1003 15:04:34.414722 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f61a03-c4e7-414d-b6f9-b1f920d35757" containerName="keystone-cron" Oct 03 15:04:34 crc kubenswrapper[4636]: I1003 15:04:34.414902 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f61a03-c4e7-414d-b6f9-b1f920d35757" containerName="keystone-cron" Oct 03 15:04:34 crc kubenswrapper[4636]: I1003 15:04:34.416348 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j82hl" Oct 03 15:04:34 crc kubenswrapper[4636]: I1003 15:04:34.434092 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j82hl"] Oct 03 15:04:34 crc kubenswrapper[4636]: I1003 15:04:34.470350 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03740898-1b7a-431c-ab83-6302efd7f921-catalog-content\") pod \"redhat-marketplace-j82hl\" (UID: \"03740898-1b7a-431c-ab83-6302efd7f921\") " pod="openshift-marketplace/redhat-marketplace-j82hl" Oct 03 15:04:34 crc kubenswrapper[4636]: I1003 15:04:34.470493 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9pqp\" (UniqueName: \"kubernetes.io/projected/03740898-1b7a-431c-ab83-6302efd7f921-kube-api-access-j9pqp\") pod \"redhat-marketplace-j82hl\" (UID: \"03740898-1b7a-431c-ab83-6302efd7f921\") " pod="openshift-marketplace/redhat-marketplace-j82hl" Oct 03 15:04:34 crc kubenswrapper[4636]: I1003 15:04:34.470545 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03740898-1b7a-431c-ab83-6302efd7f921-utilities\") pod \"redhat-marketplace-j82hl\" (UID: \"03740898-1b7a-431c-ab83-6302efd7f921\") " pod="openshift-marketplace/redhat-marketplace-j82hl" Oct 03 15:04:34 crc kubenswrapper[4636]: I1003 15:04:34.571892 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9pqp\" (UniqueName: \"kubernetes.io/projected/03740898-1b7a-431c-ab83-6302efd7f921-kube-api-access-j9pqp\") pod \"redhat-marketplace-j82hl\" (UID: \"03740898-1b7a-431c-ab83-6302efd7f921\") " pod="openshift-marketplace/redhat-marketplace-j82hl" Oct 03 15:04:34 crc kubenswrapper[4636]: I1003 15:04:34.572211 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03740898-1b7a-431c-ab83-6302efd7f921-utilities\") pod \"redhat-marketplace-j82hl\" (UID: \"03740898-1b7a-431c-ab83-6302efd7f921\") " pod="openshift-marketplace/redhat-marketplace-j82hl" Oct 03 15:04:34 crc kubenswrapper[4636]: I1003 15:04:34.572287 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03740898-1b7a-431c-ab83-6302efd7f921-catalog-content\") pod \"redhat-marketplace-j82hl\" (UID: \"03740898-1b7a-431c-ab83-6302efd7f921\") " pod="openshift-marketplace/redhat-marketplace-j82hl" Oct 03 15:04:34 crc kubenswrapper[4636]: I1003 15:04:34.572675 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03740898-1b7a-431c-ab83-6302efd7f921-utilities\") pod \"redhat-marketplace-j82hl\" (UID: \"03740898-1b7a-431c-ab83-6302efd7f921\") " pod="openshift-marketplace/redhat-marketplace-j82hl" Oct 03 15:04:34 crc kubenswrapper[4636]: I1003 15:04:34.572763 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03740898-1b7a-431c-ab83-6302efd7f921-catalog-content\") pod \"redhat-marketplace-j82hl\" (UID: \"03740898-1b7a-431c-ab83-6302efd7f921\") " pod="openshift-marketplace/redhat-marketplace-j82hl" Oct 03 15:04:34 crc kubenswrapper[4636]: I1003 15:04:34.663770 4636 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-j9pqp\" (UniqueName: \"kubernetes.io/projected/03740898-1b7a-431c-ab83-6302efd7f921-kube-api-access-j9pqp\") pod \"redhat-marketplace-j82hl\" (UID: \"03740898-1b7a-431c-ab83-6302efd7f921\") " pod="openshift-marketplace/redhat-marketplace-j82hl" Oct 03 15:04:34 crc kubenswrapper[4636]: I1003 15:04:34.743919 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j82hl" Oct 03 15:04:35 crc kubenswrapper[4636]: I1003 15:04:35.317904 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j82hl"] Oct 03 15:04:35 crc kubenswrapper[4636]: I1003 15:04:35.644238 4636 generic.go:334] "Generic (PLEG): container finished" podID="03740898-1b7a-431c-ab83-6302efd7f921" containerID="b6cf0960765cf89646ed75cbfaf651e70b96af71c5053dc7c776dc2c49350c35" exitCode=0 Oct 03 15:04:35 crc kubenswrapper[4636]: I1003 15:04:35.644535 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j82hl" event={"ID":"03740898-1b7a-431c-ab83-6302efd7f921","Type":"ContainerDied","Data":"b6cf0960765cf89646ed75cbfaf651e70b96af71c5053dc7c776dc2c49350c35"} Oct 03 15:04:35 crc kubenswrapper[4636]: I1003 15:04:35.645381 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j82hl" event={"ID":"03740898-1b7a-431c-ab83-6302efd7f921","Type":"ContainerStarted","Data":"54abb73ee8040927cee127616c3cfc9e9afe7699eff8e0c5794e7fd5fdb56c0d"} Oct 03 15:04:36 crc kubenswrapper[4636]: I1003 15:04:36.656150 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j82hl" event={"ID":"03740898-1b7a-431c-ab83-6302efd7f921","Type":"ContainerStarted","Data":"eb257d33718ed5980e808c6c8ff269276260797ba9f48180abd90e4c6e52c556"} Oct 03 15:04:37 crc kubenswrapper[4636]: I1003 15:04:37.666554 4636 generic.go:334] "Generic (PLEG): container finished" podID="03740898-1b7a-431c-ab83-6302efd7f921" containerID="eb257d33718ed5980e808c6c8ff269276260797ba9f48180abd90e4c6e52c556" exitCode=0 Oct 03 15:04:37 crc kubenswrapper[4636]: I1003 15:04:37.666745 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j82hl" event={"ID":"03740898-1b7a-431c-ab83-6302efd7f921","Type":"ContainerDied","Data":"eb257d33718ed5980e808c6c8ff269276260797ba9f48180abd90e4c6e52c556"} Oct 03 15:04:38 crc kubenswrapper[4636]: I1003 15:04:38.678288 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j82hl" event={"ID":"03740898-1b7a-431c-ab83-6302efd7f921","Type":"ContainerStarted","Data":"8180121cae73ba01bf955236c0c0e0b1f9f2b75e9febbb5d51e959181b5b3c4d"} Oct 03 15:04:38 crc kubenswrapper[4636]: I1003 15:04:38.706129 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j82hl" podStartSLOduration=2.205658535 podStartE2EDuration="4.70609227s" podCreationTimestamp="2025-10-03 15:04:34 +0000 UTC" firstStartedPulling="2025-10-03 15:04:35.645996886 +0000 UTC m=+3825.504723133" lastFinishedPulling="2025-10-03 15:04:38.146430621 +0000 UTC m=+3828.005156868" observedRunningTime="2025-10-03 15:04:38.698766277 +0000 UTC m=+3828.557492524" watchObservedRunningTime="2025-10-03 15:04:38.70609227 +0000 UTC m=+3828.564818517" Oct 03 15:04:38 crc kubenswrapper[4636]: I1003 15:04:38.795738 4636 scope.go:117] "RemoveContainer" 
containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:04:38 crc kubenswrapper[4636]: E1003 15:04:38.795994 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:04:44 crc kubenswrapper[4636]: I1003 15:04:44.744845 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j82hl" Oct 03 15:04:44 crc kubenswrapper[4636]: I1003 15:04:44.745789 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j82hl" Oct 03 15:04:44 crc kubenswrapper[4636]: I1003 15:04:44.807746 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j82hl" Oct 03 15:04:45 crc kubenswrapper[4636]: I1003 15:04:45.781051 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j82hl" Oct 03 15:04:45 crc kubenswrapper[4636]: I1003 15:04:45.834912 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j82hl"] Oct 03 15:04:47 crc kubenswrapper[4636]: I1003 15:04:47.770625 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j82hl" podUID="03740898-1b7a-431c-ab83-6302efd7f921" containerName="registry-server" containerID="cri-o://8180121cae73ba01bf955236c0c0e0b1f9f2b75e9febbb5d51e959181b5b3c4d" gracePeriod=2 Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.388962 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j82hl" Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.552988 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03740898-1b7a-431c-ab83-6302efd7f921-utilities\") pod \"03740898-1b7a-431c-ab83-6302efd7f921\" (UID: \"03740898-1b7a-431c-ab83-6302efd7f921\") " Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.553086 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03740898-1b7a-431c-ab83-6302efd7f921-catalog-content\") pod \"03740898-1b7a-431c-ab83-6302efd7f921\" (UID: \"03740898-1b7a-431c-ab83-6302efd7f921\") " Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.553211 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9pqp\" (UniqueName: \"kubernetes.io/projected/03740898-1b7a-431c-ab83-6302efd7f921-kube-api-access-j9pqp\") pod \"03740898-1b7a-431c-ab83-6302efd7f921\" (UID: \"03740898-1b7a-431c-ab83-6302efd7f921\") " Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.554052 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03740898-1b7a-431c-ab83-6302efd7f921-utilities" (OuterVolumeSpecName: "utilities") pod "03740898-1b7a-431c-ab83-6302efd7f921" (UID: "03740898-1b7a-431c-ab83-6302efd7f921"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.561226 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03740898-1b7a-431c-ab83-6302efd7f921-kube-api-access-j9pqp" (OuterVolumeSpecName: "kube-api-access-j9pqp") pod "03740898-1b7a-431c-ab83-6302efd7f921" (UID: "03740898-1b7a-431c-ab83-6302efd7f921"). InnerVolumeSpecName "kube-api-access-j9pqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.566795 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03740898-1b7a-431c-ab83-6302efd7f921-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03740898-1b7a-431c-ab83-6302efd7f921" (UID: "03740898-1b7a-431c-ab83-6302efd7f921"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.654936 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03740898-1b7a-431c-ab83-6302efd7f921-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.655326 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03740898-1b7a-431c-ab83-6302efd7f921-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.655343 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9pqp\" (UniqueName: \"kubernetes.io/projected/03740898-1b7a-431c-ab83-6302efd7f921-kube-api-access-j9pqp\") on node \"crc\" DevicePath \"\"" Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.783080 4636 generic.go:334] "Generic (PLEG): container finished" podID="03740898-1b7a-431c-ab83-6302efd7f921" containerID="8180121cae73ba01bf955236c0c0e0b1f9f2b75e9febbb5d51e959181b5b3c4d" exitCode=0 Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.783142 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j82hl" event={"ID":"03740898-1b7a-431c-ab83-6302efd7f921","Type":"ContainerDied","Data":"8180121cae73ba01bf955236c0c0e0b1f9f2b75e9febbb5d51e959181b5b3c4d"} Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.783178 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j82hl" event={"ID":"03740898-1b7a-431c-ab83-6302efd7f921","Type":"ContainerDied","Data":"54abb73ee8040927cee127616c3cfc9e9afe7699eff8e0c5794e7fd5fdb56c0d"} Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.783199 4636 scope.go:117] "RemoveContainer" containerID="8180121cae73ba01bf955236c0c0e0b1f9f2b75e9febbb5d51e959181b5b3c4d" Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.783249 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j82hl" Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.807719 4636 scope.go:117] "RemoveContainer" containerID="eb257d33718ed5980e808c6c8ff269276260797ba9f48180abd90e4c6e52c556" Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.830826 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j82hl"] Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.844674 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j82hl"] Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.845030 4636 scope.go:117] "RemoveContainer" containerID="b6cf0960765cf89646ed75cbfaf651e70b96af71c5053dc7c776dc2c49350c35" Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.884952 4636 scope.go:117] "RemoveContainer" containerID="8180121cae73ba01bf955236c0c0e0b1f9f2b75e9febbb5d51e959181b5b3c4d" Oct 03 15:04:48 crc kubenswrapper[4636]: E1003 15:04:48.886362 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8180121cae73ba01bf955236c0c0e0b1f9f2b75e9febbb5d51e959181b5b3c4d\": container with ID starting with 8180121cae73ba01bf955236c0c0e0b1f9f2b75e9febbb5d51e959181b5b3c4d not found: ID does not exist" containerID="8180121cae73ba01bf955236c0c0e0b1f9f2b75e9febbb5d51e959181b5b3c4d" Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.886421 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8180121cae73ba01bf955236c0c0e0b1f9f2b75e9febbb5d51e959181b5b3c4d"} err="failed to get container status \"8180121cae73ba01bf955236c0c0e0b1f9f2b75e9febbb5d51e959181b5b3c4d\": rpc error: code = NotFound desc = could not find container \"8180121cae73ba01bf955236c0c0e0b1f9f2b75e9febbb5d51e959181b5b3c4d\": container with ID starting with 8180121cae73ba01bf955236c0c0e0b1f9f2b75e9febbb5d51e959181b5b3c4d not found: ID does not exist" Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.886449 4636 scope.go:117] "RemoveContainer" containerID="eb257d33718ed5980e808c6c8ff269276260797ba9f48180abd90e4c6e52c556" Oct 03 15:04:48 crc kubenswrapper[4636]: E1003 15:04:48.886900 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb257d33718ed5980e808c6c8ff269276260797ba9f48180abd90e4c6e52c556\": container with ID starting with eb257d33718ed5980e808c6c8ff269276260797ba9f48180abd90e4c6e52c556 not found: ID does not exist" containerID="eb257d33718ed5980e808c6c8ff269276260797ba9f48180abd90e4c6e52c556" Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.886971 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb257d33718ed5980e808c6c8ff269276260797ba9f48180abd90e4c6e52c556"} err="failed to get container status \"eb257d33718ed5980e808c6c8ff269276260797ba9f48180abd90e4c6e52c556\": rpc error: code = NotFound desc = could not find container \"eb257d33718ed5980e808c6c8ff269276260797ba9f48180abd90e4c6e52c556\": container with ID starting with eb257d33718ed5980e808c6c8ff269276260797ba9f48180abd90e4c6e52c556 not found: ID does not exist" Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.887006 4636 scope.go:117] "RemoveContainer" containerID="b6cf0960765cf89646ed75cbfaf651e70b96af71c5053dc7c776dc2c49350c35" Oct 03 15:04:48 crc kubenswrapper[4636]: E1003 15:04:48.887401 4636 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b6cf0960765cf89646ed75cbfaf651e70b96af71c5053dc7c776dc2c49350c35\": container with ID starting with b6cf0960765cf89646ed75cbfaf651e70b96af71c5053dc7c776dc2c49350c35 not found: ID does not exist" containerID="b6cf0960765cf89646ed75cbfaf651e70b96af71c5053dc7c776dc2c49350c35" Oct 03 15:04:48 crc kubenswrapper[4636]: I1003 15:04:48.887442 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6cf0960765cf89646ed75cbfaf651e70b96af71c5053dc7c776dc2c49350c35"} err="failed to get container status \"b6cf0960765cf89646ed75cbfaf651e70b96af71c5053dc7c776dc2c49350c35\": rpc error: code = NotFound desc = could not find container \"b6cf0960765cf89646ed75cbfaf651e70b96af71c5053dc7c776dc2c49350c35\": container with ID starting with b6cf0960765cf89646ed75cbfaf651e70b96af71c5053dc7c776dc2c49350c35 not found: ID does not exist" Oct 03 15:04:50 crc kubenswrapper[4636]: I1003 15:04:50.804780 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03740898-1b7a-431c-ab83-6302efd7f921" path="/var/lib/kubelet/pods/03740898-1b7a-431c-ab83-6302efd7f921/volumes" Oct 03 15:04:53 crc kubenswrapper[4636]: I1003 15:04:53.793856 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:04:53 crc kubenswrapper[4636]: E1003 15:04:53.795310 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:05:04 crc kubenswrapper[4636]: I1003 15:05:04.793705 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:05:04 crc kubenswrapper[4636]: E1003 15:05:04.794479 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:05:19 crc kubenswrapper[4636]: I1003 15:05:19.794267 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:05:19 crc kubenswrapper[4636]: E1003 15:05:19.795298 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:05:34 crc kubenswrapper[4636]: I1003 15:05:34.794578 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:05:34 crc kubenswrapper[4636]: E1003 15:05:34.795296 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:05:47 crc kubenswrapper[4636]: I1003 15:05:47.793658 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:05:47 crc kubenswrapper[4636]: E1003 15:05:47.794324 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:06:00 crc kubenswrapper[4636]: I1003 15:06:00.801183 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:06:00 crc kubenswrapper[4636]: E1003 15:06:00.802014 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:06:14 crc kubenswrapper[4636]: I1003 15:06:14.794491 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb" Oct 03 15:06:15 crc kubenswrapper[4636]: I1003 15:06:15.518940 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerStarted","Data":"84a5dd6102f235276b0029df84d8db680ef147a39fc2e71bdb50b3c8976a0eee"} Oct 03 15:06:28 crc kubenswrapper[4636]: I1003 15:06:28.187054 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zxbms"] Oct 03 15:06:28 crc kubenswrapper[4636]: E1003 15:06:28.188047 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03740898-1b7a-431c-ab83-6302efd7f921" containerName="registry-server" Oct 03 15:06:28 crc kubenswrapper[4636]: I1003 15:06:28.188063 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="03740898-1b7a-431c-ab83-6302efd7f921" containerName="registry-server" Oct 03 15:06:28 crc kubenswrapper[4636]: E1003 15:06:28.188083 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03740898-1b7a-431c-ab83-6302efd7f921" containerName="extract-content" Oct 03 15:06:28 crc kubenswrapper[4636]: I1003 15:06:28.188091 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="03740898-1b7a-431c-ab83-6302efd7f921" containerName="extract-content" Oct 03 15:06:28 crc kubenswrapper[4636]: E1003 15:06:28.188144 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03740898-1b7a-431c-ab83-6302efd7f921" containerName="extract-utilities" Oct 03 15:06:28 crc kubenswrapper[4636]: I1003 15:06:28.188153 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="03740898-1b7a-431c-ab83-6302efd7f921" 
containerName="extract-utilities" Oct 03 15:06:28 crc kubenswrapper[4636]: I1003 15:06:28.188385 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="03740898-1b7a-431c-ab83-6302efd7f921" containerName="registry-server" Oct 03 15:06:28 crc kubenswrapper[4636]: I1003 15:06:28.189985 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zxbms" Oct 03 15:06:28 crc kubenswrapper[4636]: I1003 15:06:28.201151 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zxbms"] Oct 03 15:06:28 crc kubenswrapper[4636]: I1003 15:06:28.262798 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cf3288e-2ccc-4249-afba-0976f0696f7d-utilities\") pod \"redhat-operators-zxbms\" (UID: \"6cf3288e-2ccc-4249-afba-0976f0696f7d\") " pod="openshift-marketplace/redhat-operators-zxbms" Oct 03 15:06:28 crc kubenswrapper[4636]: I1003 15:06:28.263093 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggtwz\" (UniqueName: \"kubernetes.io/projected/6cf3288e-2ccc-4249-afba-0976f0696f7d-kube-api-access-ggtwz\") pod \"redhat-operators-zxbms\" (UID: \"6cf3288e-2ccc-4249-afba-0976f0696f7d\") " pod="openshift-marketplace/redhat-operators-zxbms" Oct 03 15:06:28 crc kubenswrapper[4636]: I1003 15:06:28.263133 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cf3288e-2ccc-4249-afba-0976f0696f7d-catalog-content\") pod \"redhat-operators-zxbms\" (UID: \"6cf3288e-2ccc-4249-afba-0976f0696f7d\") " pod="openshift-marketplace/redhat-operators-zxbms" Oct 03 15:06:28 crc kubenswrapper[4636]: I1003 15:06:28.365067 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cf3288e-2ccc-4249-afba-0976f0696f7d-catalog-content\") pod \"redhat-operators-zxbms\" (UID: \"6cf3288e-2ccc-4249-afba-0976f0696f7d\") " pod="openshift-marketplace/redhat-operators-zxbms" Oct 03 15:06:28 crc kubenswrapper[4636]: I1003 15:06:28.365326 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cf3288e-2ccc-4249-afba-0976f0696f7d-utilities\") pod \"redhat-operators-zxbms\" (UID: \"6cf3288e-2ccc-4249-afba-0976f0696f7d\") " pod="openshift-marketplace/redhat-operators-zxbms" Oct 03 15:06:28 crc kubenswrapper[4636]: I1003 15:06:28.365373 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggtwz\" (UniqueName: \"kubernetes.io/projected/6cf3288e-2ccc-4249-afba-0976f0696f7d-kube-api-access-ggtwz\") pod \"redhat-operators-zxbms\" (UID: \"6cf3288e-2ccc-4249-afba-0976f0696f7d\") " pod="openshift-marketplace/redhat-operators-zxbms" Oct 03 15:06:28 crc kubenswrapper[4636]: I1003 15:06:28.365590 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cf3288e-2ccc-4249-afba-0976f0696f7d-catalog-content\") pod \"redhat-operators-zxbms\" (UID: \"6cf3288e-2ccc-4249-afba-0976f0696f7d\") " pod="openshift-marketplace/redhat-operators-zxbms" Oct 03 15:06:28 crc kubenswrapper[4636]: I1003 15:06:28.365802 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6cf3288e-2ccc-4249-afba-0976f0696f7d-utilities\") pod \"redhat-operators-zxbms\" (UID: \"6cf3288e-2ccc-4249-afba-0976f0696f7d\") " pod="openshift-marketplace/redhat-operators-zxbms" Oct 03 15:06:28 crc kubenswrapper[4636]: I1003 15:06:28.396597 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggtwz\" (UniqueName: \"kubernetes.io/projected/6cf3288e-2ccc-4249-afba-0976f0696f7d-kube-api-access-ggtwz\") pod \"redhat-operators-zxbms\" (UID: \"6cf3288e-2ccc-4249-afba-0976f0696f7d\") " pod="openshift-marketplace/redhat-operators-zxbms" Oct 03 15:06:28 crc kubenswrapper[4636]: I1003 15:06:28.510318 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zxbms" Oct 03 15:06:29 crc kubenswrapper[4636]: I1003 15:06:29.035253 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zxbms"] Oct 03 15:06:29 crc kubenswrapper[4636]: I1003 15:06:29.697239 4636 generic.go:334] "Generic (PLEG): container finished" podID="6cf3288e-2ccc-4249-afba-0976f0696f7d" containerID="3d6b34d9de2d492f0211b262ce7c5f0ba1f30f7801468aaebfc5f0a3357c3846" exitCode=0 Oct 03 15:06:29 crc kubenswrapper[4636]: I1003 15:06:29.697351 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxbms" event={"ID":"6cf3288e-2ccc-4249-afba-0976f0696f7d","Type":"ContainerDied","Data":"3d6b34d9de2d492f0211b262ce7c5f0ba1f30f7801468aaebfc5f0a3357c3846"} Oct 03 15:06:29 crc kubenswrapper[4636]: I1003 15:06:29.697522 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxbms" event={"ID":"6cf3288e-2ccc-4249-afba-0976f0696f7d","Type":"ContainerStarted","Data":"161a685422371c3b018005d3546993356470bed55f6df443d3b4b28950273413"} Oct 03 15:06:29 crc kubenswrapper[4636]: I1003 15:06:29.699547 4636 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 15:06:31 crc kubenswrapper[4636]: I1003 15:06:31.715305 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxbms" event={"ID":"6cf3288e-2ccc-4249-afba-0976f0696f7d","Type":"ContainerStarted","Data":"4365e86cf6e1e2a2f97f0234ce42516d4b52c3cd5a7c6f947d08c583a11e0ee0"} Oct 03 15:06:33 crc kubenswrapper[4636]: I1003 15:06:33.760766 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tfcq5"] Oct 03 15:06:33 crc kubenswrapper[4636]: I1003 15:06:33.762865 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tfcq5" Oct 03 15:06:33 crc kubenswrapper[4636]: I1003 15:06:33.771988 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tfcq5"] Oct 03 15:06:33 crc kubenswrapper[4636]: I1003 15:06:33.869566 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae522afa-cb70-408e-9025-1a668ffc23cb-catalog-content\") pod \"certified-operators-tfcq5\" (UID: \"ae522afa-cb70-408e-9025-1a668ffc23cb\") " pod="openshift-marketplace/certified-operators-tfcq5" Oct 03 15:06:33 crc kubenswrapper[4636]: I1003 15:06:33.870046 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae522afa-cb70-408e-9025-1a668ffc23cb-utilities\") pod \"certified-operators-tfcq5\" (UID: \"ae522afa-cb70-408e-9025-1a668ffc23cb\") " pod="openshift-marketplace/certified-operators-tfcq5" Oct 03 15:06:33 crc kubenswrapper[4636]: I1003 15:06:33.870180 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s7qp\" (UniqueName: \"kubernetes.io/projected/ae522afa-cb70-408e-9025-1a668ffc23cb-kube-api-access-6s7qp\") pod \"certified-operators-tfcq5\" (UID: \"ae522afa-cb70-408e-9025-1a668ffc23cb\") " pod="openshift-marketplace/certified-operators-tfcq5" Oct 03 15:06:33 crc kubenswrapper[4636]: I1003 15:06:33.972133 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae522afa-cb70-408e-9025-1a668ffc23cb-utilities\") pod \"certified-operators-tfcq5\" (UID: \"ae522afa-cb70-408e-9025-1a668ffc23cb\") " pod="openshift-marketplace/certified-operators-tfcq5" Oct 03 15:06:33 crc kubenswrapper[4636]: I1003 15:06:33.972450 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s7qp\" (UniqueName: \"kubernetes.io/projected/ae522afa-cb70-408e-9025-1a668ffc23cb-kube-api-access-6s7qp\") pod \"certified-operators-tfcq5\" (UID: \"ae522afa-cb70-408e-9025-1a668ffc23cb\") " pod="openshift-marketplace/certified-operators-tfcq5" Oct 03 15:06:33 crc kubenswrapper[4636]: I1003 15:06:33.972482 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae522afa-cb70-408e-9025-1a668ffc23cb-catalog-content\") pod \"certified-operators-tfcq5\" (UID: \"ae522afa-cb70-408e-9025-1a668ffc23cb\") " pod="openshift-marketplace/certified-operators-tfcq5" Oct 03 15:06:33 crc kubenswrapper[4636]: I1003 15:06:33.972711 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae522afa-cb70-408e-9025-1a668ffc23cb-utilities\") pod \"certified-operators-tfcq5\" (UID: \"ae522afa-cb70-408e-9025-1a668ffc23cb\") " pod="openshift-marketplace/certified-operators-tfcq5" Oct 03 15:06:33 crc kubenswrapper[4636]: I1003 15:06:33.972908 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae522afa-cb70-408e-9025-1a668ffc23cb-catalog-content\") pod \"certified-operators-tfcq5\" (UID: \"ae522afa-cb70-408e-9025-1a668ffc23cb\") " pod="openshift-marketplace/certified-operators-tfcq5" Oct 03 15:06:33 crc kubenswrapper[4636]: I1003 15:06:33.996883 4636 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6s7qp\" (UniqueName: \"kubernetes.io/projected/ae522afa-cb70-408e-9025-1a668ffc23cb-kube-api-access-6s7qp\") pod \"certified-operators-tfcq5\" (UID: \"ae522afa-cb70-408e-9025-1a668ffc23cb\") " pod="openshift-marketplace/certified-operators-tfcq5" Oct 03 15:06:34 crc kubenswrapper[4636]: I1003 15:06:34.081931 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tfcq5" Oct 03 15:06:34 crc kubenswrapper[4636]: I1003 15:06:34.694126 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tfcq5"] Oct 03 15:06:34 crc kubenswrapper[4636]: I1003 15:06:34.750459 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfcq5" event={"ID":"ae522afa-cb70-408e-9025-1a668ffc23cb","Type":"ContainerStarted","Data":"05369e978f297c17c7067d8dda9b32b0aa4f723d221a689dd84a18dee82da2c2"} Oct 03 15:06:35 crc kubenswrapper[4636]: I1003 15:06:35.761141 4636 generic.go:334] "Generic (PLEG): container finished" podID="ae522afa-cb70-408e-9025-1a668ffc23cb" containerID="738c89f2f118e9221adc6e6c4fd318f33a6365c2f28e588043e97ed76d501ee2" exitCode=0 Oct 03 15:06:35 crc kubenswrapper[4636]: I1003 15:06:35.761288 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfcq5" event={"ID":"ae522afa-cb70-408e-9025-1a668ffc23cb","Type":"ContainerDied","Data":"738c89f2f118e9221adc6e6c4fd318f33a6365c2f28e588043e97ed76d501ee2"} Oct 03 15:06:36 crc kubenswrapper[4636]: I1003 15:06:36.775274 4636 generic.go:334] "Generic (PLEG): container finished" podID="6cf3288e-2ccc-4249-afba-0976f0696f7d" containerID="4365e86cf6e1e2a2f97f0234ce42516d4b52c3cd5a7c6f947d08c583a11e0ee0" exitCode=0 Oct 03 15:06:36 crc kubenswrapper[4636]: I1003 15:06:36.775301 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxbms" event={"ID":"6cf3288e-2ccc-4249-afba-0976f0696f7d","Type":"ContainerDied","Data":"4365e86cf6e1e2a2f97f0234ce42516d4b52c3cd5a7c6f947d08c583a11e0ee0"} Oct 03 15:06:37 crc kubenswrapper[4636]: I1003 15:06:37.788059 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfcq5" event={"ID":"ae522afa-cb70-408e-9025-1a668ffc23cb","Type":"ContainerStarted","Data":"d7c0d962b7c4ef50e2e4b7789be4b4e528ff261ad1eb55437b5a34542e7d8996"} Oct 03 15:06:37 crc kubenswrapper[4636]: I1003 15:06:37.794041 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxbms" event={"ID":"6cf3288e-2ccc-4249-afba-0976f0696f7d","Type":"ContainerStarted","Data":"41b6b3a4a44ac6b69036859496f8530d677821545ff077eccf8578cd9d01a0e5"} Oct 03 15:06:37 crc kubenswrapper[4636]: I1003 15:06:37.829427 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zxbms" podStartSLOduration=2.262302259 podStartE2EDuration="9.829411839s" podCreationTimestamp="2025-10-03 15:06:28 +0000 UTC" firstStartedPulling="2025-10-03 15:06:29.699177589 +0000 UTC m=+3939.557903866" lastFinishedPulling="2025-10-03 15:06:37.266287189 +0000 UTC m=+3947.125013446" observedRunningTime="2025-10-03 15:06:37.820889195 +0000 UTC m=+3947.679615442" watchObservedRunningTime="2025-10-03 15:06:37.829411839 +0000 UTC m=+3947.688138086" Oct 03 15:06:38 crc kubenswrapper[4636]: I1003 15:06:38.510902 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-zxbms" Oct 03 15:06:38 crc kubenswrapper[4636]: I1003 15:06:38.511249 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zxbms" Oct 03 15:06:38 crc kubenswrapper[4636]: I1003 15:06:38.804592 4636 generic.go:334] "Generic (PLEG): container finished" podID="ae522afa-cb70-408e-9025-1a668ffc23cb" containerID="d7c0d962b7c4ef50e2e4b7789be4b4e528ff261ad1eb55437b5a34542e7d8996" exitCode=0 Oct 03 15:06:38 crc kubenswrapper[4636]: I1003 15:06:38.809640 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfcq5" event={"ID":"ae522afa-cb70-408e-9025-1a668ffc23cb","Type":"ContainerDied","Data":"d7c0d962b7c4ef50e2e4b7789be4b4e528ff261ad1eb55437b5a34542e7d8996"} Oct 03 15:06:39 crc kubenswrapper[4636]: I1003 15:06:39.610149 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zxbms" podUID="6cf3288e-2ccc-4249-afba-0976f0696f7d" containerName="registry-server" probeResult="failure" output=< Oct 03 15:06:39 crc kubenswrapper[4636]: timeout: failed to connect service ":50051" within 1s Oct 03 15:06:39 crc kubenswrapper[4636]: > Oct 03 15:06:43 crc kubenswrapper[4636]: I1003 15:06:43.851294 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfcq5" event={"ID":"ae522afa-cb70-408e-9025-1a668ffc23cb","Type":"ContainerStarted","Data":"2175d7054975ab2d937f6ec03faadeb06b4f48ed87ab21996867c8834d39e0af"} Oct 03 15:06:43 crc kubenswrapper[4636]: I1003 15:06:43.878823 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tfcq5" podStartSLOduration=3.970873917 podStartE2EDuration="10.878799508s" podCreationTimestamp="2025-10-03 15:06:33 +0000 UTC" firstStartedPulling="2025-10-03 15:06:35.763183747 +0000 UTC m=+3945.621910024" lastFinishedPulling="2025-10-03 15:06:42.671109368 +0000 UTC m=+3952.529835615" observedRunningTime="2025-10-03 15:06:43.870274653 +0000 UTC m=+3953.729000920" watchObservedRunningTime="2025-10-03 15:06:43.878799508 +0000 UTC m=+3953.737525755" Oct 03 15:06:44 crc kubenswrapper[4636]: I1003 15:06:44.082795 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tfcq5" Oct 03 15:06:44 crc kubenswrapper[4636]: I1003 15:06:44.082966 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tfcq5" Oct 03 15:06:45 crc kubenswrapper[4636]: I1003 15:06:45.133639 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-tfcq5" podUID="ae522afa-cb70-408e-9025-1a668ffc23cb" containerName="registry-server" probeResult="failure" output=< Oct 03 15:06:45 crc kubenswrapper[4636]: timeout: failed to connect service ":50051" within 1s Oct 03 15:06:45 crc kubenswrapper[4636]: > Oct 03 15:06:49 crc kubenswrapper[4636]: I1003 15:06:49.935026 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zxbms" podUID="6cf3288e-2ccc-4249-afba-0976f0696f7d" containerName="registry-server" probeResult="failure" output=< Oct 03 15:06:49 crc kubenswrapper[4636]: timeout: failed to connect service ":50051" within 1s Oct 03 15:06:49 crc kubenswrapper[4636]: > Oct 03 15:06:54 crc kubenswrapper[4636]: I1003 15:06:54.131738 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-tfcq5" Oct 03 15:06:54 crc kubenswrapper[4636]: I1003 15:06:54.180040 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tfcq5" Oct 03 15:06:54 crc kubenswrapper[4636]: I1003 15:06:54.370709 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tfcq5"] Oct 03 15:06:55 crc kubenswrapper[4636]: I1003 15:06:55.948261 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tfcq5" podUID="ae522afa-cb70-408e-9025-1a668ffc23cb" containerName="registry-server" containerID="cri-o://2175d7054975ab2d937f6ec03faadeb06b4f48ed87ab21996867c8834d39e0af" gracePeriod=2 Oct 03 15:06:56 crc kubenswrapper[4636]: I1003 15:06:56.943744 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tfcq5" Oct 03 15:06:56 crc kubenswrapper[4636]: I1003 15:06:56.959482 4636 generic.go:334] "Generic (PLEG): container finished" podID="ae522afa-cb70-408e-9025-1a668ffc23cb" containerID="2175d7054975ab2d937f6ec03faadeb06b4f48ed87ab21996867c8834d39e0af" exitCode=0 Oct 03 15:06:56 crc kubenswrapper[4636]: I1003 15:06:56.959546 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfcq5" event={"ID":"ae522afa-cb70-408e-9025-1a668ffc23cb","Type":"ContainerDied","Data":"2175d7054975ab2d937f6ec03faadeb06b4f48ed87ab21996867c8834d39e0af"} Oct 03 15:06:56 crc kubenswrapper[4636]: I1003 15:06:56.959581 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfcq5" event={"ID":"ae522afa-cb70-408e-9025-1a668ffc23cb","Type":"ContainerDied","Data":"05369e978f297c17c7067d8dda9b32b0aa4f723d221a689dd84a18dee82da2c2"} Oct 03 15:06:56 crc kubenswrapper[4636]: I1003 15:06:56.959601 4636 scope.go:117] "RemoveContainer" containerID="2175d7054975ab2d937f6ec03faadeb06b4f48ed87ab21996867c8834d39e0af" Oct 03 15:06:56 crc kubenswrapper[4636]: I1003 15:06:56.959769 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tfcq5" Oct 03 15:06:56 crc kubenswrapper[4636]: I1003 15:06:56.995717 4636 scope.go:117] "RemoveContainer" containerID="d7c0d962b7c4ef50e2e4b7789be4b4e528ff261ad1eb55437b5a34542e7d8996" Oct 03 15:06:57 crc kubenswrapper[4636]: I1003 15:06:57.016450 4636 scope.go:117] "RemoveContainer" containerID="738c89f2f118e9221adc6e6c4fd318f33a6365c2f28e588043e97ed76d501ee2" Oct 03 15:06:57 crc kubenswrapper[4636]: I1003 15:06:57.054324 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s7qp\" (UniqueName: \"kubernetes.io/projected/ae522afa-cb70-408e-9025-1a668ffc23cb-kube-api-access-6s7qp\") pod \"ae522afa-cb70-408e-9025-1a668ffc23cb\" (UID: \"ae522afa-cb70-408e-9025-1a668ffc23cb\") " Oct 03 15:06:57 crc kubenswrapper[4636]: I1003 15:06:57.054517 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae522afa-cb70-408e-9025-1a668ffc23cb-catalog-content\") pod \"ae522afa-cb70-408e-9025-1a668ffc23cb\" (UID: \"ae522afa-cb70-408e-9025-1a668ffc23cb\") " Oct 03 15:06:57 crc kubenswrapper[4636]: I1003 15:06:57.054610 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae522afa-cb70-408e-9025-1a668ffc23cb-utilities\") pod \"ae522afa-cb70-408e-9025-1a668ffc23cb\" (UID: \"ae522afa-cb70-408e-9025-1a668ffc23cb\") " Oct 03 15:06:57 crc kubenswrapper[4636]: I1003 15:06:57.055400 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae522afa-cb70-408e-9025-1a668ffc23cb-utilities" (OuterVolumeSpecName: "utilities") pod "ae522afa-cb70-408e-9025-1a668ffc23cb" (UID: "ae522afa-cb70-408e-9025-1a668ffc23cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:06:57 crc kubenswrapper[4636]: I1003 15:06:57.060987 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae522afa-cb70-408e-9025-1a668ffc23cb-kube-api-access-6s7qp" (OuterVolumeSpecName: "kube-api-access-6s7qp") pod "ae522afa-cb70-408e-9025-1a668ffc23cb" (UID: "ae522afa-cb70-408e-9025-1a668ffc23cb"). InnerVolumeSpecName "kube-api-access-6s7qp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:06:57 crc kubenswrapper[4636]: I1003 15:06:57.064614 4636 scope.go:117] "RemoveContainer" containerID="2175d7054975ab2d937f6ec03faadeb06b4f48ed87ab21996867c8834d39e0af" Oct 03 15:06:57 crc kubenswrapper[4636]: E1003 15:06:57.065021 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2175d7054975ab2d937f6ec03faadeb06b4f48ed87ab21996867c8834d39e0af\": container with ID starting with 2175d7054975ab2d937f6ec03faadeb06b4f48ed87ab21996867c8834d39e0af not found: ID does not exist" containerID="2175d7054975ab2d937f6ec03faadeb06b4f48ed87ab21996867c8834d39e0af" Oct 03 15:06:57 crc kubenswrapper[4636]: I1003 15:06:57.065058 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2175d7054975ab2d937f6ec03faadeb06b4f48ed87ab21996867c8834d39e0af"} err="failed to get container status \"2175d7054975ab2d937f6ec03faadeb06b4f48ed87ab21996867c8834d39e0af\": rpc error: code = NotFound desc = could not find container \"2175d7054975ab2d937f6ec03faadeb06b4f48ed87ab21996867c8834d39e0af\": container with ID starting with 2175d7054975ab2d937f6ec03faadeb06b4f48ed87ab21996867c8834d39e0af not found: ID does not exist" Oct 03 15:06:57 crc kubenswrapper[4636]: I1003 15:06:57.065082 4636 scope.go:117] "RemoveContainer" containerID="d7c0d962b7c4ef50e2e4b7789be4b4e528ff261ad1eb55437b5a34542e7d8996" Oct 03 15:06:57 crc kubenswrapper[4636]: E1003 15:06:57.065685 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7c0d962b7c4ef50e2e4b7789be4b4e528ff261ad1eb55437b5a34542e7d8996\": container with ID starting with d7c0d962b7c4ef50e2e4b7789be4b4e528ff261ad1eb55437b5a34542e7d8996 not found: ID does not exist" containerID="d7c0d962b7c4ef50e2e4b7789be4b4e528ff261ad1eb55437b5a34542e7d8996" Oct 03 15:06:57 crc kubenswrapper[4636]: I1003 15:06:57.065718 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c0d962b7c4ef50e2e4b7789be4b4e528ff261ad1eb55437b5a34542e7d8996"} err="failed to get container status \"d7c0d962b7c4ef50e2e4b7789be4b4e528ff261ad1eb55437b5a34542e7d8996\": rpc error: code = NotFound desc = could not find container \"d7c0d962b7c4ef50e2e4b7789be4b4e528ff261ad1eb55437b5a34542e7d8996\": container with ID starting with d7c0d962b7c4ef50e2e4b7789be4b4e528ff261ad1eb55437b5a34542e7d8996 not found: ID does not exist" Oct 03 15:06:57 crc kubenswrapper[4636]: I1003 15:06:57.065741 4636 scope.go:117] "RemoveContainer" containerID="738c89f2f118e9221adc6e6c4fd318f33a6365c2f28e588043e97ed76d501ee2" Oct 03 15:06:57 crc kubenswrapper[4636]: E1003 15:06:57.066046 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"738c89f2f118e9221adc6e6c4fd318f33a6365c2f28e588043e97ed76d501ee2\": container with ID starting with 738c89f2f118e9221adc6e6c4fd318f33a6365c2f28e588043e97ed76d501ee2 not found: ID does not exist" containerID="738c89f2f118e9221adc6e6c4fd318f33a6365c2f28e588043e97ed76d501ee2" Oct 03 15:06:57 crc kubenswrapper[4636]: I1003 15:06:57.066079 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"738c89f2f118e9221adc6e6c4fd318f33a6365c2f28e588043e97ed76d501ee2"} err="failed to get container status \"738c89f2f118e9221adc6e6c4fd318f33a6365c2f28e588043e97ed76d501ee2\": rpc error: code = NotFound desc = could not 
find container \"738c89f2f118e9221adc6e6c4fd318f33a6365c2f28e588043e97ed76d501ee2\": container with ID starting with 738c89f2f118e9221adc6e6c4fd318f33a6365c2f28e588043e97ed76d501ee2 not found: ID does not exist" Oct 03 15:06:57 crc kubenswrapper[4636]: I1003 15:06:57.105003 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae522afa-cb70-408e-9025-1a668ffc23cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae522afa-cb70-408e-9025-1a668ffc23cb" (UID: "ae522afa-cb70-408e-9025-1a668ffc23cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:06:57 crc kubenswrapper[4636]: I1003 15:06:57.157419 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s7qp\" (UniqueName: \"kubernetes.io/projected/ae522afa-cb70-408e-9025-1a668ffc23cb-kube-api-access-6s7qp\") on node \"crc\" DevicePath \"\"" Oct 03 15:06:57 crc kubenswrapper[4636]: I1003 15:06:57.157491 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae522afa-cb70-408e-9025-1a668ffc23cb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:06:57 crc kubenswrapper[4636]: I1003 15:06:57.157510 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae522afa-cb70-408e-9025-1a668ffc23cb-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:06:57 crc kubenswrapper[4636]: I1003 15:06:57.666190 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tfcq5"] Oct 03 15:06:57 crc kubenswrapper[4636]: I1003 15:06:57.675911 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tfcq5"] Oct 03 15:06:58 crc kubenswrapper[4636]: I1003 15:06:58.807246 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae522afa-cb70-408e-9025-1a668ffc23cb" path="/var/lib/kubelet/pods/ae522afa-cb70-408e-9025-1a668ffc23cb/volumes" Oct 03 15:06:59 crc kubenswrapper[4636]: I1003 15:06:59.572215 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zxbms" podUID="6cf3288e-2ccc-4249-afba-0976f0696f7d" containerName="registry-server" probeResult="failure" output=< Oct 03 15:06:59 crc kubenswrapper[4636]: timeout: failed to connect service ":50051" within 1s Oct 03 15:06:59 crc kubenswrapper[4636]: > Oct 03 15:07:08 crc kubenswrapper[4636]: I1003 15:07:08.933010 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zxbms" Oct 03 15:07:08 crc kubenswrapper[4636]: I1003 15:07:08.993906 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zxbms" Oct 03 15:07:09 crc kubenswrapper[4636]: I1003 15:07:09.176965 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zxbms"] Oct 03 15:07:10 crc kubenswrapper[4636]: I1003 15:07:10.070205 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zxbms" podUID="6cf3288e-2ccc-4249-afba-0976f0696f7d" containerName="registry-server" containerID="cri-o://41b6b3a4a44ac6b69036859496f8530d677821545ff077eccf8578cd9d01a0e5" gracePeriod=2 Oct 03 15:07:10 crc kubenswrapper[4636]: I1003 15:07:10.732897 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zxbms" Oct 03 15:07:10 crc kubenswrapper[4636]: I1003 15:07:10.820562 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggtwz\" (UniqueName: \"kubernetes.io/projected/6cf3288e-2ccc-4249-afba-0976f0696f7d-kube-api-access-ggtwz\") pod \"6cf3288e-2ccc-4249-afba-0976f0696f7d\" (UID: \"6cf3288e-2ccc-4249-afba-0976f0696f7d\") " Oct 03 15:07:10 crc kubenswrapper[4636]: I1003 15:07:10.820803 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cf3288e-2ccc-4249-afba-0976f0696f7d-utilities\") pod \"6cf3288e-2ccc-4249-afba-0976f0696f7d\" (UID: \"6cf3288e-2ccc-4249-afba-0976f0696f7d\") " Oct 03 15:07:10 crc kubenswrapper[4636]: I1003 15:07:10.820898 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cf3288e-2ccc-4249-afba-0976f0696f7d-catalog-content\") pod \"6cf3288e-2ccc-4249-afba-0976f0696f7d\" (UID: \"6cf3288e-2ccc-4249-afba-0976f0696f7d\") " Oct 03 15:07:10 crc kubenswrapper[4636]: I1003 15:07:10.821731 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cf3288e-2ccc-4249-afba-0976f0696f7d-utilities" (OuterVolumeSpecName: "utilities") pod "6cf3288e-2ccc-4249-afba-0976f0696f7d" (UID: "6cf3288e-2ccc-4249-afba-0976f0696f7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:07:10 crc kubenswrapper[4636]: I1003 15:07:10.839426 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cf3288e-2ccc-4249-afba-0976f0696f7d-kube-api-access-ggtwz" (OuterVolumeSpecName: "kube-api-access-ggtwz") pod "6cf3288e-2ccc-4249-afba-0976f0696f7d" (UID: "6cf3288e-2ccc-4249-afba-0976f0696f7d"). InnerVolumeSpecName "kube-api-access-ggtwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:07:10 crc kubenswrapper[4636]: I1003 15:07:10.923069 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cf3288e-2ccc-4249-afba-0976f0696f7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:07:10 crc kubenswrapper[4636]: I1003 15:07:10.923495 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggtwz\" (UniqueName: \"kubernetes.io/projected/6cf3288e-2ccc-4249-afba-0976f0696f7d-kube-api-access-ggtwz\") on node \"crc\" DevicePath \"\"" Oct 03 15:07:10 crc kubenswrapper[4636]: I1003 15:07:10.932472 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cf3288e-2ccc-4249-afba-0976f0696f7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6cf3288e-2ccc-4249-afba-0976f0696f7d" (UID: "6cf3288e-2ccc-4249-afba-0976f0696f7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:07:11 crc kubenswrapper[4636]: I1003 15:07:11.025809 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cf3288e-2ccc-4249-afba-0976f0696f7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:07:11 crc kubenswrapper[4636]: I1003 15:07:11.080425 4636 generic.go:334] "Generic (PLEG): container finished" podID="6cf3288e-2ccc-4249-afba-0976f0696f7d" containerID="41b6b3a4a44ac6b69036859496f8530d677821545ff077eccf8578cd9d01a0e5" exitCode=0 Oct 03 15:07:11 crc kubenswrapper[4636]: I1003 15:07:11.080468 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxbms" event={"ID":"6cf3288e-2ccc-4249-afba-0976f0696f7d","Type":"ContainerDied","Data":"41b6b3a4a44ac6b69036859496f8530d677821545ff077eccf8578cd9d01a0e5"} Oct 03 15:07:11 crc kubenswrapper[4636]: I1003 15:07:11.080495 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxbms" event={"ID":"6cf3288e-2ccc-4249-afba-0976f0696f7d","Type":"ContainerDied","Data":"161a685422371c3b018005d3546993356470bed55f6df443d3b4b28950273413"} Oct 03 15:07:11 crc kubenswrapper[4636]: I1003 15:07:11.080513 4636 scope.go:117] "RemoveContainer" containerID="41b6b3a4a44ac6b69036859496f8530d677821545ff077eccf8578cd9d01a0e5" Oct 03 15:07:11 crc kubenswrapper[4636]: I1003 15:07:11.080514 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zxbms" Oct 03 15:07:11 crc kubenswrapper[4636]: I1003 15:07:11.099988 4636 scope.go:117] "RemoveContainer" containerID="4365e86cf6e1e2a2f97f0234ce42516d4b52c3cd5a7c6f947d08c583a11e0ee0" Oct 03 15:07:11 crc kubenswrapper[4636]: I1003 15:07:11.115846 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zxbms"] Oct 03 15:07:11 crc kubenswrapper[4636]: I1003 15:07:11.124318 4636 scope.go:117] "RemoveContainer" containerID="3d6b34d9de2d492f0211b262ce7c5f0ba1f30f7801468aaebfc5f0a3357c3846" Oct 03 15:07:11 crc kubenswrapper[4636]: I1003 15:07:11.126963 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zxbms"] Oct 03 15:07:11 crc kubenswrapper[4636]: I1003 15:07:11.164187 4636 scope.go:117] "RemoveContainer" containerID="41b6b3a4a44ac6b69036859496f8530d677821545ff077eccf8578cd9d01a0e5" Oct 03 15:07:11 crc kubenswrapper[4636]: E1003 15:07:11.164712 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41b6b3a4a44ac6b69036859496f8530d677821545ff077eccf8578cd9d01a0e5\": container with ID starting with 41b6b3a4a44ac6b69036859496f8530d677821545ff077eccf8578cd9d01a0e5 not found: ID does not exist" containerID="41b6b3a4a44ac6b69036859496f8530d677821545ff077eccf8578cd9d01a0e5" Oct 03 15:07:11 crc kubenswrapper[4636]: I1003 15:07:11.164749 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41b6b3a4a44ac6b69036859496f8530d677821545ff077eccf8578cd9d01a0e5"} err="failed to get container status \"41b6b3a4a44ac6b69036859496f8530d677821545ff077eccf8578cd9d01a0e5\": rpc error: code = NotFound desc = could not find container \"41b6b3a4a44ac6b69036859496f8530d677821545ff077eccf8578cd9d01a0e5\": container with ID starting with 41b6b3a4a44ac6b69036859496f8530d677821545ff077eccf8578cd9d01a0e5 not found: ID does not exist" Oct 03 15:07:11 crc 
Oct 03 15:07:11 crc kubenswrapper[4636]: I1003 15:07:11.164776 4636 scope.go:117] "RemoveContainer" containerID="4365e86cf6e1e2a2f97f0234ce42516d4b52c3cd5a7c6f947d08c583a11e0ee0"
Oct 03 15:07:11 crc kubenswrapper[4636]: E1003 15:07:11.165070 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4365e86cf6e1e2a2f97f0234ce42516d4b52c3cd5a7c6f947d08c583a11e0ee0\": container with ID starting with 4365e86cf6e1e2a2f97f0234ce42516d4b52c3cd5a7c6f947d08c583a11e0ee0 not found: ID does not exist" containerID="4365e86cf6e1e2a2f97f0234ce42516d4b52c3cd5a7c6f947d08c583a11e0ee0"
Oct 03 15:07:11 crc kubenswrapper[4636]: I1003 15:07:11.165124 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4365e86cf6e1e2a2f97f0234ce42516d4b52c3cd5a7c6f947d08c583a11e0ee0"} err="failed to get container status \"4365e86cf6e1e2a2f97f0234ce42516d4b52c3cd5a7c6f947d08c583a11e0ee0\": rpc error: code = NotFound desc = could not find container \"4365e86cf6e1e2a2f97f0234ce42516d4b52c3cd5a7c6f947d08c583a11e0ee0\": container with ID starting with 4365e86cf6e1e2a2f97f0234ce42516d4b52c3cd5a7c6f947d08c583a11e0ee0 not found: ID does not exist"
Oct 03 15:07:11 crc kubenswrapper[4636]: I1003 15:07:11.165153 4636 scope.go:117] "RemoveContainer" containerID="3d6b34d9de2d492f0211b262ce7c5f0ba1f30f7801468aaebfc5f0a3357c3846"
Oct 03 15:07:11 crc kubenswrapper[4636]: E1003 15:07:11.166902 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d6b34d9de2d492f0211b262ce7c5f0ba1f30f7801468aaebfc5f0a3357c3846\": container with ID starting with 3d6b34d9de2d492f0211b262ce7c5f0ba1f30f7801468aaebfc5f0a3357c3846 not found: ID does not exist" containerID="3d6b34d9de2d492f0211b262ce7c5f0ba1f30f7801468aaebfc5f0a3357c3846"
Oct 03 15:07:11 crc kubenswrapper[4636]: I1003 15:07:11.166944 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d6b34d9de2d492f0211b262ce7c5f0ba1f30f7801468aaebfc5f0a3357c3846"} err="failed to get container status \"3d6b34d9de2d492f0211b262ce7c5f0ba1f30f7801468aaebfc5f0a3357c3846\": rpc error: code = NotFound desc = could not find container \"3d6b34d9de2d492f0211b262ce7c5f0ba1f30f7801468aaebfc5f0a3357c3846\": container with ID starting with 3d6b34d9de2d492f0211b262ce7c5f0ba1f30f7801468aaebfc5f0a3357c3846 not found: ID does not exist"
Oct 03 15:07:12 crc kubenswrapper[4636]: I1003 15:07:12.804883 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cf3288e-2ccc-4249-afba-0976f0696f7d" path="/var/lib/kubelet/pods/6cf3288e-2ccc-4249-afba-0976f0696f7d/volumes"
Oct 03 15:08:39 crc kubenswrapper[4636]: I1003 15:08:39.163452 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 15:08:39 crc kubenswrapper[4636]: I1003 15:08:39.164006 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 15:09:09 crc kubenswrapper[4636]: I1003 15:09:09.162599 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 15:09:09 crc kubenswrapper[4636]: I1003 15:09:09.163071 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 15:09:39 crc kubenswrapper[4636]: I1003 15:09:39.162767 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 15:09:39 crc kubenswrapper[4636]: I1003 15:09:39.163345 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 15:09:39 crc kubenswrapper[4636]: I1003 15:09:39.163386 4636 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ngmch"
Oct 03 15:09:39 crc kubenswrapper[4636]: I1003 15:09:39.163862 4636 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"84a5dd6102f235276b0029df84d8db680ef147a39fc2e71bdb50b3c8976a0eee"} pod="openshift-machine-config-operator/machine-config-daemon-ngmch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 03 15:09:39 crc kubenswrapper[4636]: I1003 15:09:39.163911 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" containerID="cri-o://84a5dd6102f235276b0029df84d8db680ef147a39fc2e71bdb50b3c8976a0eee" gracePeriod=600
Oct 03 15:09:39 crc kubenswrapper[4636]: I1003 15:09:39.437056 4636 generic.go:334] "Generic (PLEG): container finished" podID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerID="84a5dd6102f235276b0029df84d8db680ef147a39fc2e71bdb50b3c8976a0eee" exitCode=0
Oct 03 15:09:39 crc kubenswrapper[4636]: I1003 15:09:39.437142 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerDied","Data":"84a5dd6102f235276b0029df84d8db680ef147a39fc2e71bdb50b3c8976a0eee"}
Oct 03 15:09:39 crc kubenswrapper[4636]: I1003 15:09:39.437244 4636 scope.go:117] "RemoveContainer" containerID="45ca0bf3c16a815738d05770e6a44fe066633f8c22bcf2041de1ad871e542acb"
Oct 03 15:09:40 crc kubenswrapper[4636]: I1003 15:09:40.448840 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerStarted","Data":"bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc"}
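[editor's note] The failure sequence above (15:08:39, 15:09:09, 15:09:39, then a kill with gracePeriod=600) is consistent with an HTTP liveness probe against http://127.0.0.1:8798/health firing every 30s with a failure threshold of 3. Those two numbers are inferred from the log cadence, not read from the machine-config-daemon manifest; the sketch below only reconstructs that implied shape using the k8s.io/api types (in recent versions the embedded handler field is named ProbeHandler; it was Handler before v1.24):

// probe_sketch.go - the liveness probe shape implied by the entries above,
// as a sketch only; PeriodSeconds and FailureThreshold are inferences.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
	probe := corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Host: "127.0.0.1",
				Path: "/health",
				Port: intstr.FromInt(8798),
			},
		},
		PeriodSeconds:    30, // inferred: failures land exactly 30s apart
		FailureThreshold: 3,  // inferred: the third straight failure triggers the kill
	}
	fmt.Printf("%+v\n", probe)
}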
Oct 03 15:11:39 crc kubenswrapper[4636]: I1003 15:11:39.162725 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 15:11:39 crc kubenswrapper[4636]: I1003 15:11:39.163384 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 15:12:09 crc kubenswrapper[4636]: I1003 15:12:09.163388 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 15:12:09 crc kubenswrapper[4636]: I1003 15:12:09.163907 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 15:12:39 crc kubenswrapper[4636]: I1003 15:12:39.163474 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 15:12:39 crc kubenswrapper[4636]: I1003 15:12:39.164028 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 15:12:39 crc kubenswrapper[4636]: I1003 15:12:39.164077 4636 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ngmch"
Oct 03 15:12:39 crc kubenswrapper[4636]: I1003 15:12:39.164868 4636 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc"} pod="openshift-machine-config-operator/machine-config-daemon-ngmch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 03 15:12:39 crc kubenswrapper[4636]: I1003 15:12:39.164927 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" containerID="cri-o://bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc" gracePeriod=600
Oct 03 15:12:39 crc kubenswrapper[4636]: E1003 15:12:39.306562 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 15:12:40 crc kubenswrapper[4636]: I1003 15:12:40.017027 4636 generic.go:334] "Generic (PLEG): container finished" podID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc" exitCode=0
Oct 03 15:12:40 crc kubenswrapper[4636]: I1003 15:12:40.017074 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerDied","Data":"bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc"}
Oct 03 15:12:40 crc kubenswrapper[4636]: I1003 15:12:40.017410 4636 scope.go:117] "RemoveContainer" containerID="84a5dd6102f235276b0029df84d8db680ef147a39fc2e71bdb50b3c8976a0eee"
Oct 03 15:12:40 crc kubenswrapper[4636]: I1003 15:12:40.017897 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc"
Oct 03 15:12:40 crc kubenswrapper[4636]: E1003 15:12:40.018214 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 15:12:43 crc kubenswrapper[4636]: I1003 15:12:43.309513 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hwpxp"]
Oct 03 15:12:43 crc kubenswrapper[4636]: E1003 15:12:43.311644 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae522afa-cb70-408e-9025-1a668ffc23cb" containerName="extract-content"
Oct 03 15:12:43 crc kubenswrapper[4636]: I1003 15:12:43.311748 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae522afa-cb70-408e-9025-1a668ffc23cb" containerName="extract-content"
Oct 03 15:12:43 crc kubenswrapper[4636]: E1003 15:12:43.311858 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf3288e-2ccc-4249-afba-0976f0696f7d" containerName="extract-utilities"
Oct 03 15:12:43 crc kubenswrapper[4636]: I1003 15:12:43.311930 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf3288e-2ccc-4249-afba-0976f0696f7d" containerName="extract-utilities"
Oct 03 15:12:43 crc kubenswrapper[4636]: E1003 15:12:43.312026 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf3288e-2ccc-4249-afba-0976f0696f7d" containerName="registry-server"
Oct 03 15:12:43 crc kubenswrapper[4636]: I1003 15:12:43.312124 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf3288e-2ccc-4249-afba-0976f0696f7d" containerName="registry-server"
Oct 03 15:12:43 crc kubenswrapper[4636]: E1003 15:12:43.312214 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae522afa-cb70-408e-9025-1a668ffc23cb" containerName="registry-server"
Oct 03 15:12:43 crc kubenswrapper[4636]: I1003 15:12:43.312281 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae522afa-cb70-408e-9025-1a668ffc23cb" containerName="registry-server"
podUID="6cf3288e-2ccc-4249-afba-0976f0696f7d" containerName="extract-content" Oct 03 15:12:43 crc kubenswrapper[4636]: I1003 15:12:43.312424 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf3288e-2ccc-4249-afba-0976f0696f7d" containerName="extract-content" Oct 03 15:12:43 crc kubenswrapper[4636]: E1003 15:12:43.312502 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae522afa-cb70-408e-9025-1a668ffc23cb" containerName="extract-utilities" Oct 03 15:12:43 crc kubenswrapper[4636]: I1003 15:12:43.312568 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae522afa-cb70-408e-9025-1a668ffc23cb" containerName="extract-utilities" Oct 03 15:12:43 crc kubenswrapper[4636]: I1003 15:12:43.312852 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cf3288e-2ccc-4249-afba-0976f0696f7d" containerName="registry-server" Oct 03 15:12:43 crc kubenswrapper[4636]: I1003 15:12:43.312962 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae522afa-cb70-408e-9025-1a668ffc23cb" containerName="registry-server" Oct 03 15:12:43 crc kubenswrapper[4636]: I1003 15:12:43.314699 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hwpxp" Oct 03 15:12:43 crc kubenswrapper[4636]: I1003 15:12:43.336474 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hwpxp"] Oct 03 15:12:43 crc kubenswrapper[4636]: I1003 15:12:43.482242 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a7b5f0c-5565-42d2-b431-9d175c75ee9d-catalog-content\") pod \"community-operators-hwpxp\" (UID: \"9a7b5f0c-5565-42d2-b431-9d175c75ee9d\") " pod="openshift-marketplace/community-operators-hwpxp" Oct 03 15:12:43 crc kubenswrapper[4636]: I1003 15:12:43.482633 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5srm\" (UniqueName: \"kubernetes.io/projected/9a7b5f0c-5565-42d2-b431-9d175c75ee9d-kube-api-access-l5srm\") pod \"community-operators-hwpxp\" (UID: \"9a7b5f0c-5565-42d2-b431-9d175c75ee9d\") " pod="openshift-marketplace/community-operators-hwpxp" Oct 03 15:12:43 crc kubenswrapper[4636]: I1003 15:12:43.482782 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a7b5f0c-5565-42d2-b431-9d175c75ee9d-utilities\") pod \"community-operators-hwpxp\" (UID: \"9a7b5f0c-5565-42d2-b431-9d175c75ee9d\") " pod="openshift-marketplace/community-operators-hwpxp" Oct 03 15:12:43 crc kubenswrapper[4636]: I1003 15:12:43.584215 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5srm\" (UniqueName: \"kubernetes.io/projected/9a7b5f0c-5565-42d2-b431-9d175c75ee9d-kube-api-access-l5srm\") pod \"community-operators-hwpxp\" (UID: \"9a7b5f0c-5565-42d2-b431-9d175c75ee9d\") " pod="openshift-marketplace/community-operators-hwpxp" Oct 03 15:12:43 crc kubenswrapper[4636]: I1003 15:12:43.584311 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a7b5f0c-5565-42d2-b431-9d175c75ee9d-utilities\") pod \"community-operators-hwpxp\" (UID: \"9a7b5f0c-5565-42d2-b431-9d175c75ee9d\") " pod="openshift-marketplace/community-operators-hwpxp" Oct 03 15:12:43 crc kubenswrapper[4636]: I1003 15:12:43.584411 4636 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a7b5f0c-5565-42d2-b431-9d175c75ee9d-catalog-content\") pod \"community-operators-hwpxp\" (UID: \"9a7b5f0c-5565-42d2-b431-9d175c75ee9d\") " pod="openshift-marketplace/community-operators-hwpxp" Oct 03 15:12:43 crc kubenswrapper[4636]: I1003 15:12:43.584933 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a7b5f0c-5565-42d2-b431-9d175c75ee9d-utilities\") pod \"community-operators-hwpxp\" (UID: \"9a7b5f0c-5565-42d2-b431-9d175c75ee9d\") " pod="openshift-marketplace/community-operators-hwpxp" Oct 03 15:12:43 crc kubenswrapper[4636]: I1003 15:12:43.585014 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a7b5f0c-5565-42d2-b431-9d175c75ee9d-catalog-content\") pod \"community-operators-hwpxp\" (UID: \"9a7b5f0c-5565-42d2-b431-9d175c75ee9d\") " pod="openshift-marketplace/community-operators-hwpxp" Oct 03 15:12:43 crc kubenswrapper[4636]: I1003 15:12:43.615056 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5srm\" (UniqueName: \"kubernetes.io/projected/9a7b5f0c-5565-42d2-b431-9d175c75ee9d-kube-api-access-l5srm\") pod \"community-operators-hwpxp\" (UID: \"9a7b5f0c-5565-42d2-b431-9d175c75ee9d\") " pod="openshift-marketplace/community-operators-hwpxp" Oct 03 15:12:43 crc kubenswrapper[4636]: I1003 15:12:43.633890 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hwpxp" Oct 03 15:12:44 crc kubenswrapper[4636]: I1003 15:12:44.247288 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hwpxp"] Oct 03 15:12:45 crc kubenswrapper[4636]: I1003 15:12:45.068245 4636 generic.go:334] "Generic (PLEG): container finished" podID="9a7b5f0c-5565-42d2-b431-9d175c75ee9d" containerID="b5b867a3dc01640c5dbaaa4ca54a49b65d0756195e0a6ef6eab1101dabf95bd6" exitCode=0 Oct 03 15:12:45 crc kubenswrapper[4636]: I1003 15:12:45.068810 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwpxp" event={"ID":"9a7b5f0c-5565-42d2-b431-9d175c75ee9d","Type":"ContainerDied","Data":"b5b867a3dc01640c5dbaaa4ca54a49b65d0756195e0a6ef6eab1101dabf95bd6"} Oct 03 15:12:45 crc kubenswrapper[4636]: I1003 15:12:45.069600 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwpxp" event={"ID":"9a7b5f0c-5565-42d2-b431-9d175c75ee9d","Type":"ContainerStarted","Data":"e0f4b6922e220820c56e189db081f9fce10a393c21463af07a2ea1c5fdaee143"} Oct 03 15:12:45 crc kubenswrapper[4636]: I1003 15:12:45.070599 4636 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 15:12:47 crc kubenswrapper[4636]: I1003 15:12:47.105246 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwpxp" event={"ID":"9a7b5f0c-5565-42d2-b431-9d175c75ee9d","Type":"ContainerStarted","Data":"aa0cac31924f1afd60c5e7b3c51a2b1692c1ca2ba53fed5e55756a76afa089da"} Oct 03 15:12:48 crc kubenswrapper[4636]: I1003 15:12:48.114421 4636 generic.go:334] "Generic (PLEG): container finished" podID="9a7b5f0c-5565-42d2-b431-9d175c75ee9d" containerID="aa0cac31924f1afd60c5e7b3c51a2b1692c1ca2ba53fed5e55756a76afa089da" exitCode=0 Oct 03 15:12:48 crc kubenswrapper[4636]: I1003 
15:12:48.114520 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwpxp" event={"ID":"9a7b5f0c-5565-42d2-b431-9d175c75ee9d","Type":"ContainerDied","Data":"aa0cac31924f1afd60c5e7b3c51a2b1692c1ca2ba53fed5e55756a76afa089da"} Oct 03 15:12:49 crc kubenswrapper[4636]: I1003 15:12:49.131607 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwpxp" event={"ID":"9a7b5f0c-5565-42d2-b431-9d175c75ee9d","Type":"ContainerStarted","Data":"8f0e551057f5603088def53678836a732cc4c37d2b42a4dd5d2ac9c100220cf1"} Oct 03 15:12:49 crc kubenswrapper[4636]: I1003 15:12:49.153459 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hwpxp" podStartSLOduration=2.6503681390000002 podStartE2EDuration="6.153443695s" podCreationTimestamp="2025-10-03 15:12:43 +0000 UTC" firstStartedPulling="2025-10-03 15:12:45.070377979 +0000 UTC m=+4314.929104226" lastFinishedPulling="2025-10-03 15:12:48.573453545 +0000 UTC m=+4318.432179782" observedRunningTime="2025-10-03 15:12:49.152048799 +0000 UTC m=+4319.010775046" watchObservedRunningTime="2025-10-03 15:12:49.153443695 +0000 UTC m=+4319.012169942" Oct 03 15:12:53 crc kubenswrapper[4636]: I1003 15:12:53.635545 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hwpxp" Oct 03 15:12:53 crc kubenswrapper[4636]: I1003 15:12:53.636119 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hwpxp" Oct 03 15:12:53 crc kubenswrapper[4636]: I1003 15:12:53.686989 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hwpxp" Oct 03 15:12:53 crc kubenswrapper[4636]: I1003 15:12:53.793540 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc" Oct 03 15:12:53 crc kubenswrapper[4636]: E1003 15:12:53.793809 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:12:54 crc kubenswrapper[4636]: I1003 15:12:54.220041 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hwpxp" Oct 03 15:12:54 crc kubenswrapper[4636]: I1003 15:12:54.901941 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hwpxp"] Oct 03 15:12:56 crc kubenswrapper[4636]: I1003 15:12:56.185040 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hwpxp" podUID="9a7b5f0c-5565-42d2-b431-9d175c75ee9d" containerName="registry-server" containerID="cri-o://8f0e551057f5603088def53678836a732cc4c37d2b42a4dd5d2ac9c100220cf1" gracePeriod=2 Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.186082 4636 util.go:48] "No ready sandbox for pod can be found. 
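[editor's note] The pod_startup_latency_tracker entry above decomposes as podStartSLOduration = podStartE2EDuration minus the time spent pulling images, computed from the monotonic (m=+...) clock offsets rather than the wall-clock timestamps. A minimal check of that arithmetic, with the constants copied from the entry:

// sloduration.go - verify the tracker's arithmetic for community-operators-hwpxp
// using the m=+ monotonic offsets it logs.
package main

import "fmt"

func main() {
	const (
		e2e          = 6.153443695    // podStartE2EDuration, seconds
		firstPulling = 4314.929104226 // firstStartedPulling, m=+ offset
		lastPulled   = 4318.432179782 // lastFinishedPulling, m=+ offset
	)
	slo := e2e - (lastPulled - firstPulling)
	fmt.Printf("podStartSLOduration = %.9f s\n", slo) // 2.650368139 s, as logged
}

The redhat-marketplace-qqglg tracker entry near the end of this excerpt checks out the same way: 5.506383216 - (4473.855639970 - 4471.302911754) = 2.953655.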
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.186082 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hwpxp"
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.196857 4636 generic.go:334] "Generic (PLEG): container finished" podID="9a7b5f0c-5565-42d2-b431-9d175c75ee9d" containerID="8f0e551057f5603088def53678836a732cc4c37d2b42a4dd5d2ac9c100220cf1" exitCode=0
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.196906 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwpxp" event={"ID":"9a7b5f0c-5565-42d2-b431-9d175c75ee9d","Type":"ContainerDied","Data":"8f0e551057f5603088def53678836a732cc4c37d2b42a4dd5d2ac9c100220cf1"}
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.196916 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hwpxp"
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.196958 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwpxp" event={"ID":"9a7b5f0c-5565-42d2-b431-9d175c75ee9d","Type":"ContainerDied","Data":"e0f4b6922e220820c56e189db081f9fce10a393c21463af07a2ea1c5fdaee143"}
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.196980 4636 scope.go:117] "RemoveContainer" containerID="8f0e551057f5603088def53678836a732cc4c37d2b42a4dd5d2ac9c100220cf1"
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.228629 4636 scope.go:117] "RemoveContainer" containerID="aa0cac31924f1afd60c5e7b3c51a2b1692c1ca2ba53fed5e55756a76afa089da"
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.255644 4636 scope.go:117] "RemoveContainer" containerID="b5b867a3dc01640c5dbaaa4ca54a49b65d0756195e0a6ef6eab1101dabf95bd6"
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.327875 4636 scope.go:117] "RemoveContainer" containerID="8f0e551057f5603088def53678836a732cc4c37d2b42a4dd5d2ac9c100220cf1"
Oct 03 15:12:57 crc kubenswrapper[4636]: E1003 15:12:57.328261 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f0e551057f5603088def53678836a732cc4c37d2b42a4dd5d2ac9c100220cf1\": container with ID starting with 8f0e551057f5603088def53678836a732cc4c37d2b42a4dd5d2ac9c100220cf1 not found: ID does not exist" containerID="8f0e551057f5603088def53678836a732cc4c37d2b42a4dd5d2ac9c100220cf1"
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.328288 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f0e551057f5603088def53678836a732cc4c37d2b42a4dd5d2ac9c100220cf1"} err="failed to get container status \"8f0e551057f5603088def53678836a732cc4c37d2b42a4dd5d2ac9c100220cf1\": rpc error: code = NotFound desc = could not find container \"8f0e551057f5603088def53678836a732cc4c37d2b42a4dd5d2ac9c100220cf1\": container with ID starting with 8f0e551057f5603088def53678836a732cc4c37d2b42a4dd5d2ac9c100220cf1 not found: ID does not exist"
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.328442 4636 scope.go:117] "RemoveContainer" containerID="aa0cac31924f1afd60c5e7b3c51a2b1692c1ca2ba53fed5e55756a76afa089da"
Oct 03 15:12:57 crc kubenswrapper[4636]: E1003 15:12:57.328707 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa0cac31924f1afd60c5e7b3c51a2b1692c1ca2ba53fed5e55756a76afa089da\": container with ID starting with aa0cac31924f1afd60c5e7b3c51a2b1692c1ca2ba53fed5e55756a76afa089da not found: ID does not exist" containerID="aa0cac31924f1afd60c5e7b3c51a2b1692c1ca2ba53fed5e55756a76afa089da"
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.328728 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa0cac31924f1afd60c5e7b3c51a2b1692c1ca2ba53fed5e55756a76afa089da"} err="failed to get container status \"aa0cac31924f1afd60c5e7b3c51a2b1692c1ca2ba53fed5e55756a76afa089da\": rpc error: code = NotFound desc = could not find container \"aa0cac31924f1afd60c5e7b3c51a2b1692c1ca2ba53fed5e55756a76afa089da\": container with ID starting with aa0cac31924f1afd60c5e7b3c51a2b1692c1ca2ba53fed5e55756a76afa089da not found: ID does not exist"
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.328741 4636 scope.go:117] "RemoveContainer" containerID="b5b867a3dc01640c5dbaaa4ca54a49b65d0756195e0a6ef6eab1101dabf95bd6"
Oct 03 15:12:57 crc kubenswrapper[4636]: E1003 15:12:57.328963 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5b867a3dc01640c5dbaaa4ca54a49b65d0756195e0a6ef6eab1101dabf95bd6\": container with ID starting with b5b867a3dc01640c5dbaaa4ca54a49b65d0756195e0a6ef6eab1101dabf95bd6 not found: ID does not exist" containerID="b5b867a3dc01640c5dbaaa4ca54a49b65d0756195e0a6ef6eab1101dabf95bd6"
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.328981 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5b867a3dc01640c5dbaaa4ca54a49b65d0756195e0a6ef6eab1101dabf95bd6"} err="failed to get container status \"b5b867a3dc01640c5dbaaa4ca54a49b65d0756195e0a6ef6eab1101dabf95bd6\": rpc error: code = NotFound desc = could not find container \"b5b867a3dc01640c5dbaaa4ca54a49b65d0756195e0a6ef6eab1101dabf95bd6\": container with ID starting with b5b867a3dc01640c5dbaaa4ca54a49b65d0756195e0a6ef6eab1101dabf95bd6 not found: ID does not exist"
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.330278 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5srm\" (UniqueName: \"kubernetes.io/projected/9a7b5f0c-5565-42d2-b431-9d175c75ee9d-kube-api-access-l5srm\") pod \"9a7b5f0c-5565-42d2-b431-9d175c75ee9d\" (UID: \"9a7b5f0c-5565-42d2-b431-9d175c75ee9d\") "
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.330556 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a7b5f0c-5565-42d2-b431-9d175c75ee9d-catalog-content\") pod \"9a7b5f0c-5565-42d2-b431-9d175c75ee9d\" (UID: \"9a7b5f0c-5565-42d2-b431-9d175c75ee9d\") "
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.330601 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a7b5f0c-5565-42d2-b431-9d175c75ee9d-utilities\") pod \"9a7b5f0c-5565-42d2-b431-9d175c75ee9d\" (UID: \"9a7b5f0c-5565-42d2-b431-9d175c75ee9d\") "
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.331795 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a7b5f0c-5565-42d2-b431-9d175c75ee9d-utilities" (OuterVolumeSpecName: "utilities") pod "9a7b5f0c-5565-42d2-b431-9d175c75ee9d" (UID: "9a7b5f0c-5565-42d2-b431-9d175c75ee9d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.338724 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a7b5f0c-5565-42d2-b431-9d175c75ee9d-kube-api-access-l5srm" (OuterVolumeSpecName: "kube-api-access-l5srm") pod "9a7b5f0c-5565-42d2-b431-9d175c75ee9d" (UID: "9a7b5f0c-5565-42d2-b431-9d175c75ee9d"). InnerVolumeSpecName "kube-api-access-l5srm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.381819 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a7b5f0c-5565-42d2-b431-9d175c75ee9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a7b5f0c-5565-42d2-b431-9d175c75ee9d" (UID: "9a7b5f0c-5565-42d2-b431-9d175c75ee9d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.432840 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a7b5f0c-5565-42d2-b431-9d175c75ee9d-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.432869 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a7b5f0c-5565-42d2-b431-9d175c75ee9d-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.432878 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5srm\" (UniqueName: \"kubernetes.io/projected/9a7b5f0c-5565-42d2-b431-9d175c75ee9d-kube-api-access-l5srm\") on node \"crc\" DevicePath \"\""
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.540137 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hwpxp"]
Oct 03 15:12:57 crc kubenswrapper[4636]: I1003 15:12:57.548870 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hwpxp"]
Oct 03 15:12:58 crc kubenswrapper[4636]: I1003 15:12:58.803474 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a7b5f0c-5565-42d2-b431-9d175c75ee9d" path="/var/lib/kubelet/pods/9a7b5f0c-5565-42d2-b431-9d175c75ee9d/volumes"
Oct 03 15:13:07 crc kubenswrapper[4636]: I1003 15:13:07.794110 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc"
Oct 03 15:13:07 crc kubenswrapper[4636]: E1003 15:13:07.794706 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 15:13:19 crc kubenswrapper[4636]: I1003 15:13:19.794435 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc"
Oct 03 15:13:19 crc kubenswrapper[4636]: E1003 15:13:19.795212 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 15:13:31 crc kubenswrapper[4636]: I1003 15:13:31.793786 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc"
Oct 03 15:13:31 crc kubenswrapper[4636]: E1003 15:13:31.794505 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 15:13:46 crc kubenswrapper[4636]: I1003 15:13:46.794615 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc"
Oct 03 15:13:46 crc kubenswrapper[4636]: E1003 15:13:46.795919 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 15:13:59 crc kubenswrapper[4636]: I1003 15:13:59.796295 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc"
Oct 03 15:13:59 crc kubenswrapper[4636]: E1003 15:13:59.797701 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 15:14:11 crc kubenswrapper[4636]: I1003 15:14:11.794660 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc"
Oct 03 15:14:11 crc kubenswrapper[4636]: E1003 15:14:11.795517 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 15:14:22 crc kubenswrapper[4636]: I1003 15:14:22.793921 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc"
Oct 03 15:14:22 crc kubenswrapper[4636]: E1003 15:14:22.794787 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 15:14:35 crc kubenswrapper[4636]: I1003 15:14:35.794172 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc"
Oct 03 15:14:35 crc kubenswrapper[4636]: E1003 15:14:35.795027 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 15:14:46 crc kubenswrapper[4636]: I1003 15:14:46.795471 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc"
Oct 03 15:14:46 crc kubenswrapper[4636]: E1003 15:14:46.796558 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 15:14:59 crc kubenswrapper[4636]: I1003 15:14:59.793520 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc"
Oct 03 15:14:59 crc kubenswrapper[4636]: E1003 15:14:59.795291 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 15:15:00 crc kubenswrapper[4636]: I1003 15:15:00.144700 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325075-nr796"]
Oct 03 15:15:00 crc kubenswrapper[4636]: E1003 15:15:00.145193 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a7b5f0c-5565-42d2-b431-9d175c75ee9d" containerName="extract-utilities"
Oct 03 15:15:00 crc kubenswrapper[4636]: I1003 15:15:00.145211 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a7b5f0c-5565-42d2-b431-9d175c75ee9d" containerName="extract-utilities"
Oct 03 15:15:00 crc kubenswrapper[4636]: E1003 15:15:00.145229 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a7b5f0c-5565-42d2-b431-9d175c75ee9d" containerName="registry-server"
Oct 03 15:15:00 crc kubenswrapper[4636]: I1003 15:15:00.145237 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a7b5f0c-5565-42d2-b431-9d175c75ee9d" containerName="registry-server"
Oct 03 15:15:00 crc kubenswrapper[4636]: E1003 15:15:00.145267 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a7b5f0c-5565-42d2-b431-9d175c75ee9d" containerName="extract-content"
Oct 03 15:15:00 crc kubenswrapper[4636]: I1003 15:15:00.145274 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a7b5f0c-5565-42d2-b431-9d175c75ee9d" containerName="extract-content"
"RemoveStaleState removing state" podUID="9a7b5f0c-5565-42d2-b431-9d175c75ee9d" containerName="registry-server" Oct 03 15:15:00 crc kubenswrapper[4636]: I1003 15:15:00.146251 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-nr796" Oct 03 15:15:00 crc kubenswrapper[4636]: I1003 15:15:00.148089 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 15:15:00 crc kubenswrapper[4636]: I1003 15:15:00.150147 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 15:15:00 crc kubenswrapper[4636]: I1003 15:15:00.157197 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325075-nr796"] Oct 03 15:15:00 crc kubenswrapper[4636]: I1003 15:15:00.242229 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ecbeab0-6819-427c-90ba-d1377640e8ff-secret-volume\") pod \"collect-profiles-29325075-nr796\" (UID: \"6ecbeab0-6819-427c-90ba-d1377640e8ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-nr796" Oct 03 15:15:00 crc kubenswrapper[4636]: I1003 15:15:00.242584 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ecbeab0-6819-427c-90ba-d1377640e8ff-config-volume\") pod \"collect-profiles-29325075-nr796\" (UID: \"6ecbeab0-6819-427c-90ba-d1377640e8ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-nr796" Oct 03 15:15:00 crc kubenswrapper[4636]: I1003 15:15:00.242747 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52gsr\" (UniqueName: \"kubernetes.io/projected/6ecbeab0-6819-427c-90ba-d1377640e8ff-kube-api-access-52gsr\") pod \"collect-profiles-29325075-nr796\" (UID: \"6ecbeab0-6819-427c-90ba-d1377640e8ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-nr796" Oct 03 15:15:00 crc kubenswrapper[4636]: I1003 15:15:00.344006 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ecbeab0-6819-427c-90ba-d1377640e8ff-secret-volume\") pod \"collect-profiles-29325075-nr796\" (UID: \"6ecbeab0-6819-427c-90ba-d1377640e8ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-nr796" Oct 03 15:15:00 crc kubenswrapper[4636]: I1003 15:15:00.344084 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ecbeab0-6819-427c-90ba-d1377640e8ff-config-volume\") pod \"collect-profiles-29325075-nr796\" (UID: \"6ecbeab0-6819-427c-90ba-d1377640e8ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-nr796" Oct 03 15:15:00 crc kubenswrapper[4636]: I1003 15:15:00.344172 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52gsr\" (UniqueName: \"kubernetes.io/projected/6ecbeab0-6819-427c-90ba-d1377640e8ff-kube-api-access-52gsr\") pod \"collect-profiles-29325075-nr796\" (UID: \"6ecbeab0-6819-427c-90ba-d1377640e8ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-nr796" Oct 03 15:15:00 crc 
kubenswrapper[4636]: I1003 15:15:00.344922 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ecbeab0-6819-427c-90ba-d1377640e8ff-config-volume\") pod \"collect-profiles-29325075-nr796\" (UID: \"6ecbeab0-6819-427c-90ba-d1377640e8ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-nr796" Oct 03 15:15:00 crc kubenswrapper[4636]: I1003 15:15:00.396112 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ecbeab0-6819-427c-90ba-d1377640e8ff-secret-volume\") pod \"collect-profiles-29325075-nr796\" (UID: \"6ecbeab0-6819-427c-90ba-d1377640e8ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-nr796" Oct 03 15:15:00 crc kubenswrapper[4636]: I1003 15:15:00.397023 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52gsr\" (UniqueName: \"kubernetes.io/projected/6ecbeab0-6819-427c-90ba-d1377640e8ff-kube-api-access-52gsr\") pod \"collect-profiles-29325075-nr796\" (UID: \"6ecbeab0-6819-427c-90ba-d1377640e8ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-nr796" Oct 03 15:15:00 crc kubenswrapper[4636]: I1003 15:15:00.465438 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-nr796" Oct 03 15:15:00 crc kubenswrapper[4636]: I1003 15:15:00.916971 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325075-nr796"] Oct 03 15:15:01 crc kubenswrapper[4636]: I1003 15:15:01.273653 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-nr796" event={"ID":"6ecbeab0-6819-427c-90ba-d1377640e8ff","Type":"ContainerStarted","Data":"fc9dbef992fa77363b38607f1b45eeca00fa1fbca213b1f80d7ef33d5f5c2122"} Oct 03 15:15:01 crc kubenswrapper[4636]: I1003 15:15:01.273923 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-nr796" event={"ID":"6ecbeab0-6819-427c-90ba-d1377640e8ff","Type":"ContainerStarted","Data":"3da634e94f0f0b4c385ce3f18edbf2a73d21e24a13db686c48700dac9c534acc"} Oct 03 15:15:01 crc kubenswrapper[4636]: I1003 15:15:01.294327 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-nr796" podStartSLOduration=1.294308861 podStartE2EDuration="1.294308861s" podCreationTimestamp="2025-10-03 15:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:15:01.286957437 +0000 UTC m=+4451.145683684" watchObservedRunningTime="2025-10-03 15:15:01.294308861 +0000 UTC m=+4451.153035108" Oct 03 15:15:03 crc kubenswrapper[4636]: I1003 15:15:03.293922 4636 generic.go:334] "Generic (PLEG): container finished" podID="6ecbeab0-6819-427c-90ba-d1377640e8ff" containerID="fc9dbef992fa77363b38607f1b45eeca00fa1fbca213b1f80d7ef33d5f5c2122" exitCode=0 Oct 03 15:15:03 crc kubenswrapper[4636]: I1003 15:15:03.294009 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-nr796" event={"ID":"6ecbeab0-6819-427c-90ba-d1377640e8ff","Type":"ContainerDied","Data":"fc9dbef992fa77363b38607f1b45eeca00fa1fbca213b1f80d7ef33d5f5c2122"} Oct 03 15:15:04 crc 
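[editor's note] In the tracker entry above, firstStartedPulling and lastFinishedPulling are "0001-01-01 00:00:00 +0000 UTC": that is Go's zero time.Time, meaning no image pull ever happened for this pod (images were already present), which is why podStartSLOduration equals podStartE2EDuration (both 1.294s). A one-liner demonstrating the sentinel:

// zerotime.go - the "0001-01-01 00:00:00 +0000 UTC" sentinel is simply
// the zero value of Go's time.Time.
package main

import (
	"fmt"
	"time"
)

func main() {
	var never time.Time // zero value: no pull ever started
	fmt.Println(never.UTC(), never.IsZero()) // 0001-01-01 00:00:00 +0000 UTC true
}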
Oct 03 15:15:04 crc kubenswrapper[4636]: I1003 15:15:04.707104 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-nr796"
Oct 03 15:15:04 crc kubenswrapper[4636]: I1003 15:15:04.824431 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ecbeab0-6819-427c-90ba-d1377640e8ff-config-volume\") pod \"6ecbeab0-6819-427c-90ba-d1377640e8ff\" (UID: \"6ecbeab0-6819-427c-90ba-d1377640e8ff\") "
Oct 03 15:15:04 crc kubenswrapper[4636]: I1003 15:15:04.824498 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ecbeab0-6819-427c-90ba-d1377640e8ff-secret-volume\") pod \"6ecbeab0-6819-427c-90ba-d1377640e8ff\" (UID: \"6ecbeab0-6819-427c-90ba-d1377640e8ff\") "
Oct 03 15:15:04 crc kubenswrapper[4636]: I1003 15:15:04.824652 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52gsr\" (UniqueName: \"kubernetes.io/projected/6ecbeab0-6819-427c-90ba-d1377640e8ff-kube-api-access-52gsr\") pod \"6ecbeab0-6819-427c-90ba-d1377640e8ff\" (UID: \"6ecbeab0-6819-427c-90ba-d1377640e8ff\") "
Oct 03 15:15:04 crc kubenswrapper[4636]: I1003 15:15:04.825378 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ecbeab0-6819-427c-90ba-d1377640e8ff-config-volume" (OuterVolumeSpecName: "config-volume") pod "6ecbeab0-6819-427c-90ba-d1377640e8ff" (UID: "6ecbeab0-6819-427c-90ba-d1377640e8ff"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 15:15:04 crc kubenswrapper[4636]: I1003 15:15:04.830364 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ecbeab0-6819-427c-90ba-d1377640e8ff-kube-api-access-52gsr" (OuterVolumeSpecName: "kube-api-access-52gsr") pod "6ecbeab0-6819-427c-90ba-d1377640e8ff" (UID: "6ecbeab0-6819-427c-90ba-d1377640e8ff"). InnerVolumeSpecName "kube-api-access-52gsr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:15:04 crc kubenswrapper[4636]: I1003 15:15:04.830591 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ecbeab0-6819-427c-90ba-d1377640e8ff-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6ecbeab0-6819-427c-90ba-d1377640e8ff" (UID: "6ecbeab0-6819-427c-90ba-d1377640e8ff"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:15:04 crc kubenswrapper[4636]: I1003 15:15:04.927600 4636 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ecbeab0-6819-427c-90ba-d1377640e8ff-config-volume\") on node \"crc\" DevicePath \"\""
Oct 03 15:15:04 crc kubenswrapper[4636]: I1003 15:15:04.927637 4636 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ecbeab0-6819-427c-90ba-d1377640e8ff-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 03 15:15:04 crc kubenswrapper[4636]: I1003 15:15:04.927650 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52gsr\" (UniqueName: \"kubernetes.io/projected/6ecbeab0-6819-427c-90ba-d1377640e8ff-kube-api-access-52gsr\") on node \"crc\" DevicePath \"\""
Oct 03 15:15:05 crc kubenswrapper[4636]: I1003 15:15:05.314011 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-nr796" event={"ID":"6ecbeab0-6819-427c-90ba-d1377640e8ff","Type":"ContainerDied","Data":"3da634e94f0f0b4c385ce3f18edbf2a73d21e24a13db686c48700dac9c534acc"}
Oct 03 15:15:05 crc kubenswrapper[4636]: I1003 15:15:05.314531 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3da634e94f0f0b4c385ce3f18edbf2a73d21e24a13db686c48700dac9c534acc"
Oct 03 15:15:05 crc kubenswrapper[4636]: I1003 15:15:05.314091 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325075-nr796"
Oct 03 15:15:05 crc kubenswrapper[4636]: I1003 15:15:05.392983 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325030-sk2kj"]
Oct 03 15:15:05 crc kubenswrapper[4636]: I1003 15:15:05.402010 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325030-sk2kj"]
Oct 03 15:15:06 crc kubenswrapper[4636]: I1003 15:15:06.807947 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e6d7770-9370-4750-b396-038328ae41ef" path="/var/lib/kubelet/pods/2e6d7770-9370-4750-b396-038328ae41ef/volumes"
Oct 03 15:15:10 crc kubenswrapper[4636]: I1003 15:15:10.801730 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc"
Oct 03 15:15:10 crc kubenswrapper[4636]: E1003 15:15:10.802270 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 15:15:10 crc kubenswrapper[4636]: I1003 15:15:10.864572 4636 scope.go:117] "RemoveContainer" containerID="bae38e860cb253b42df6dcc76be10af40d8122d58598cd3c13d22cf4590659ad"
Oct 03 15:15:19 crc kubenswrapper[4636]: I1003 15:15:19.697658 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qqglg"]
Oct 03 15:15:19 crc kubenswrapper[4636]: E1003 15:15:19.698660 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ecbeab0-6819-427c-90ba-d1377640e8ff" containerName="collect-profiles"
Oct 03 15:15:19 crc kubenswrapper[4636]: I1003 15:15:19.698687 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ecbeab0-6819-427c-90ba-d1377640e8ff" containerName="collect-profiles"
Oct 03 15:15:19 crc kubenswrapper[4636]: I1003 15:15:19.698879 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ecbeab0-6819-427c-90ba-d1377640e8ff" containerName="collect-profiles"
Oct 03 15:15:19 crc kubenswrapper[4636]: I1003 15:15:19.700222 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qqglg"
Oct 03 15:15:19 crc kubenswrapper[4636]: I1003 15:15:19.767356 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qqglg"]
Oct 03 15:15:19 crc kubenswrapper[4636]: I1003 15:15:19.837229 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/376c5493-93c1-4bed-91f6-f2d008ed1644-utilities\") pod \"redhat-marketplace-qqglg\" (UID: \"376c5493-93c1-4bed-91f6-f2d008ed1644\") " pod="openshift-marketplace/redhat-marketplace-qqglg"
Oct 03 15:15:19 crc kubenswrapper[4636]: I1003 15:15:19.837309 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/376c5493-93c1-4bed-91f6-f2d008ed1644-catalog-content\") pod \"redhat-marketplace-qqglg\" (UID: \"376c5493-93c1-4bed-91f6-f2d008ed1644\") " pod="openshift-marketplace/redhat-marketplace-qqglg"
Oct 03 15:15:19 crc kubenswrapper[4636]: I1003 15:15:19.837601 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnzcp\" (UniqueName: \"kubernetes.io/projected/376c5493-93c1-4bed-91f6-f2d008ed1644-kube-api-access-vnzcp\") pod \"redhat-marketplace-qqglg\" (UID: \"376c5493-93c1-4bed-91f6-f2d008ed1644\") " pod="openshift-marketplace/redhat-marketplace-qqglg"
Oct 03 15:15:19 crc kubenswrapper[4636]: I1003 15:15:19.939254 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/376c5493-93c1-4bed-91f6-f2d008ed1644-utilities\") pod \"redhat-marketplace-qqglg\" (UID: \"376c5493-93c1-4bed-91f6-f2d008ed1644\") " pod="openshift-marketplace/redhat-marketplace-qqglg"
Oct 03 15:15:19 crc kubenswrapper[4636]: I1003 15:15:19.939335 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/376c5493-93c1-4bed-91f6-f2d008ed1644-catalog-content\") pod \"redhat-marketplace-qqglg\" (UID: \"376c5493-93c1-4bed-91f6-f2d008ed1644\") " pod="openshift-marketplace/redhat-marketplace-qqglg"
Oct 03 15:15:19 crc kubenswrapper[4636]: I1003 15:15:19.939469 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnzcp\" (UniqueName: \"kubernetes.io/projected/376c5493-93c1-4bed-91f6-f2d008ed1644-kube-api-access-vnzcp\") pod \"redhat-marketplace-qqglg\" (UID: \"376c5493-93c1-4bed-91f6-f2d008ed1644\") " pod="openshift-marketplace/redhat-marketplace-qqglg"
Oct 03 15:15:19 crc kubenswrapper[4636]: I1003 15:15:19.939687 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/376c5493-93c1-4bed-91f6-f2d008ed1644-utilities\") pod \"redhat-marketplace-qqglg\" (UID: \"376c5493-93c1-4bed-91f6-f2d008ed1644\") " pod="openshift-marketplace/redhat-marketplace-qqglg"
Oct 03 15:15:19 crc kubenswrapper[4636]: I1003 15:15:19.939823 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/376c5493-93c1-4bed-91f6-f2d008ed1644-catalog-content\") pod \"redhat-marketplace-qqglg\" (UID: \"376c5493-93c1-4bed-91f6-f2d008ed1644\") " pod="openshift-marketplace/redhat-marketplace-qqglg"
Oct 03 15:15:19 crc kubenswrapper[4636]: I1003 15:15:19.959489 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnzcp\" (UniqueName: \"kubernetes.io/projected/376c5493-93c1-4bed-91f6-f2d008ed1644-kube-api-access-vnzcp\") pod \"redhat-marketplace-qqglg\" (UID: \"376c5493-93c1-4bed-91f6-f2d008ed1644\") " pod="openshift-marketplace/redhat-marketplace-qqglg"
Oct 03 15:15:20 crc kubenswrapper[4636]: I1003 15:15:20.026598 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qqglg"
Oct 03 15:15:20 crc kubenswrapper[4636]: I1003 15:15:20.469529 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qqglg"]
Oct 03 15:15:20 crc kubenswrapper[4636]: E1003 15:15:20.913435 4636 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod376c5493_93c1_4bed_91f6_f2d008ed1644.slice/crio-3e788ea56fb6a2889b0136e0e1b9757f993526fec33bc98438f8ead39624f290.scope\": RecentStats: unable to find data in memory cache]"
Oct 03 15:15:21 crc kubenswrapper[4636]: I1003 15:15:21.441865 4636 generic.go:334] "Generic (PLEG): container finished" podID="376c5493-93c1-4bed-91f6-f2d008ed1644" containerID="3e788ea56fb6a2889b0136e0e1b9757f993526fec33bc98438f8ead39624f290" exitCode=0
Oct 03 15:15:21 crc kubenswrapper[4636]: I1003 15:15:21.441909 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqglg" event={"ID":"376c5493-93c1-4bed-91f6-f2d008ed1644","Type":"ContainerDied","Data":"3e788ea56fb6a2889b0136e0e1b9757f993526fec33bc98438f8ead39624f290"}
Oct 03 15:15:21 crc kubenswrapper[4636]: I1003 15:15:21.441963 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqglg" event={"ID":"376c5493-93c1-4bed-91f6-f2d008ed1644","Type":"ContainerStarted","Data":"e7f434bdf242d5c176b94290d47f2668e9d34270349b44d50e403b751cf49131"}
Oct 03 15:15:22 crc kubenswrapper[4636]: I1003 15:15:22.455909 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqglg" event={"ID":"376c5493-93c1-4bed-91f6-f2d008ed1644","Type":"ContainerStarted","Data":"f472a45de183961c481492e5accf7b64b3e2c5f0861ffc9764f5efddfdcc98ca"}
Oct 03 15:15:22 crc kubenswrapper[4636]: I1003 15:15:22.794360 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc"
Oct 03 15:15:22 crc kubenswrapper[4636]: E1003 15:15:22.794702 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 15:15:23 crc kubenswrapper[4636]: I1003 15:15:23.466226 4636 generic.go:334] "Generic (PLEG): container finished" podID="376c5493-93c1-4bed-91f6-f2d008ed1644" containerID="f472a45de183961c481492e5accf7b64b3e2c5f0861ffc9764f5efddfdcc98ca" exitCode=0
Oct 03 15:15:23 crc kubenswrapper[4636]: I1003 15:15:23.466307 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqglg" event={"ID":"376c5493-93c1-4bed-91f6-f2d008ed1644","Type":"ContainerDied","Data":"f472a45de183961c481492e5accf7b64b3e2c5f0861ffc9764f5efddfdcc98ca"}
Oct 03 15:15:23 crc kubenswrapper[4636]: I1003 15:15:23.471884 4636 generic.go:334] "Generic (PLEG): container finished" podID="76d391b3-cee3-4591-814b-a1b99bed1872" containerID="e5f489f65481472cf5eef3b21310941d6599e7df420d8d6207e2e83bd20d6cc6" exitCode=0
Oct 03 15:15:23 crc kubenswrapper[4636]: I1003 15:15:23.471919 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"76d391b3-cee3-4591-814b-a1b99bed1872","Type":"ContainerDied","Data":"e5f489f65481472cf5eef3b21310941d6599e7df420d8d6207e2e83bd20d6cc6"}
Oct 03 15:15:24 crc kubenswrapper[4636]: I1003 15:15:24.483210 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqglg" event={"ID":"376c5493-93c1-4bed-91f6-f2d008ed1644","Type":"ContainerStarted","Data":"18004e9676f89dd4a3917b6c1ad1b1051bfc9d5d5760ff98cc916d7f87039bcb"}
Oct 03 15:15:24 crc kubenswrapper[4636]: I1003 15:15:24.506408 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qqglg" podStartSLOduration=2.953655 podStartE2EDuration="5.506383216s" podCreationTimestamp="2025-10-03 15:15:19 +0000 UTC" firstStartedPulling="2025-10-03 15:15:21.444185497 +0000 UTC m=+4471.302911754" lastFinishedPulling="2025-10-03 15:15:23.996913733 +0000 UTC m=+4473.855639970" observedRunningTime="2025-10-03 15:15:24.503413878 +0000 UTC m=+4474.362140165" watchObservedRunningTime="2025-10-03 15:15:24.506383216 +0000 UTC m=+4474.365109463"
Oct 03 15:15:24 crc kubenswrapper[4636]: I1003 15:15:24.924499 4636 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.028915 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/76d391b3-cee3-4591-814b-a1b99bed1872-openstack-config\") pod \"76d391b3-cee3-4591-814b-a1b99bed1872\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.028950 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76d391b3-cee3-4591-814b-a1b99bed1872-config-data\") pod \"76d391b3-cee3-4591-814b-a1b99bed1872\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.028985 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4zhp\" (UniqueName: \"kubernetes.io/projected/76d391b3-cee3-4591-814b-a1b99bed1872-kube-api-access-d4zhp\") pod \"76d391b3-cee3-4591-814b-a1b99bed1872\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.029017 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"76d391b3-cee3-4591-814b-a1b99bed1872\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.029072 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/76d391b3-cee3-4591-814b-a1b99bed1872-ca-certs\") pod \"76d391b3-cee3-4591-814b-a1b99bed1872\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.029124 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/76d391b3-cee3-4591-814b-a1b99bed1872-test-operator-ephemeral-workdir\") pod \"76d391b3-cee3-4591-814b-a1b99bed1872\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.029144 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76d391b3-cee3-4591-814b-a1b99bed1872-ssh-key\") pod \"76d391b3-cee3-4591-814b-a1b99bed1872\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.029227 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/76d391b3-cee3-4591-814b-a1b99bed1872-test-operator-ephemeral-temporary\") pod \"76d391b3-cee3-4591-814b-a1b99bed1872\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.029287 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/76d391b3-cee3-4591-814b-a1b99bed1872-openstack-config-secret\") pod \"76d391b3-cee3-4591-814b-a1b99bed1872\" (UID: \"76d391b3-cee3-4591-814b-a1b99bed1872\") " Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.030635 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76d391b3-cee3-4591-814b-a1b99bed1872-test-operator-ephemeral-temporary" (OuterVolumeSpecName: 
"test-operator-ephemeral-temporary") pod "76d391b3-cee3-4591-814b-a1b99bed1872" (UID: "76d391b3-cee3-4591-814b-a1b99bed1872"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.034500 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "76d391b3-cee3-4591-814b-a1b99bed1872" (UID: "76d391b3-cee3-4591-814b-a1b99bed1872"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.035590 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76d391b3-cee3-4591-814b-a1b99bed1872-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "76d391b3-cee3-4591-814b-a1b99bed1872" (UID: "76d391b3-cee3-4591-814b-a1b99bed1872"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.035740 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76d391b3-cee3-4591-814b-a1b99bed1872-config-data" (OuterVolumeSpecName: "config-data") pod "76d391b3-cee3-4591-814b-a1b99bed1872" (UID: "76d391b3-cee3-4591-814b-a1b99bed1872"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.047419 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76d391b3-cee3-4591-814b-a1b99bed1872-kube-api-access-d4zhp" (OuterVolumeSpecName: "kube-api-access-d4zhp") pod "76d391b3-cee3-4591-814b-a1b99bed1872" (UID: "76d391b3-cee3-4591-814b-a1b99bed1872"). InnerVolumeSpecName "kube-api-access-d4zhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.058495 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76d391b3-cee3-4591-814b-a1b99bed1872-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "76d391b3-cee3-4591-814b-a1b99bed1872" (UID: "76d391b3-cee3-4591-814b-a1b99bed1872"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.059120 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76d391b3-cee3-4591-814b-a1b99bed1872-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "76d391b3-cee3-4591-814b-a1b99bed1872" (UID: "76d391b3-cee3-4591-814b-a1b99bed1872"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.059499 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76d391b3-cee3-4591-814b-a1b99bed1872-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "76d391b3-cee3-4591-814b-a1b99bed1872" (UID: "76d391b3-cee3-4591-814b-a1b99bed1872"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.083385 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76d391b3-cee3-4591-814b-a1b99bed1872-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "76d391b3-cee3-4591-814b-a1b99bed1872" (UID: "76d391b3-cee3-4591-814b-a1b99bed1872"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.132149 4636 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/76d391b3-cee3-4591-814b-a1b99bed1872-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.132190 4636 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76d391b3-cee3-4591-814b-a1b99bed1872-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.132204 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4zhp\" (UniqueName: \"kubernetes.io/projected/76d391b3-cee3-4591-814b-a1b99bed1872-kube-api-access-d4zhp\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.142053 4636 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.142088 4636 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/76d391b3-cee3-4591-814b-a1b99bed1872-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.142121 4636 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/76d391b3-cee3-4591-814b-a1b99bed1872-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.142136 4636 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76d391b3-cee3-4591-814b-a1b99bed1872-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.142151 4636 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/76d391b3-cee3-4591-814b-a1b99bed1872-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.142167 4636 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/76d391b3-cee3-4591-814b-a1b99bed1872-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.171592 4636 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.243819 4636 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.493084 4636 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/tempest-tests-tempest" event={"ID":"76d391b3-cee3-4591-814b-a1b99bed1872","Type":"ContainerDied","Data":"b27fb94c56f2a2c3b5bd262a8bf534138a99a0962c3eb32c0a66e87371ac8aab"} Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.493170 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b27fb94c56f2a2c3b5bd262a8bf534138a99a0962c3eb32c0a66e87371ac8aab" Oct 03 15:15:25 crc kubenswrapper[4636]: I1003 15:15:25.493237 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 03 15:15:30 crc kubenswrapper[4636]: I1003 15:15:30.027345 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qqglg" Oct 03 15:15:30 crc kubenswrapper[4636]: I1003 15:15:30.027660 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qqglg" Oct 03 15:15:30 crc kubenswrapper[4636]: I1003 15:15:30.095381 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qqglg" Oct 03 15:15:30 crc kubenswrapper[4636]: I1003 15:15:30.582755 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qqglg" Oct 03 15:15:30 crc kubenswrapper[4636]: I1003 15:15:30.631272 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qqglg"] Oct 03 15:15:32 crc kubenswrapper[4636]: I1003 15:15:32.557612 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qqglg" podUID="376c5493-93c1-4bed-91f6-f2d008ed1644" containerName="registry-server" containerID="cri-o://18004e9676f89dd4a3917b6c1ad1b1051bfc9d5d5760ff98cc916d7f87039bcb" gracePeriod=2 Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.099189 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qqglg" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.184629 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/376c5493-93c1-4bed-91f6-f2d008ed1644-utilities\") pod \"376c5493-93c1-4bed-91f6-f2d008ed1644\" (UID: \"376c5493-93c1-4bed-91f6-f2d008ed1644\") " Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.184766 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnzcp\" (UniqueName: \"kubernetes.io/projected/376c5493-93c1-4bed-91f6-f2d008ed1644-kube-api-access-vnzcp\") pod \"376c5493-93c1-4bed-91f6-f2d008ed1644\" (UID: \"376c5493-93c1-4bed-91f6-f2d008ed1644\") " Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.184795 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/376c5493-93c1-4bed-91f6-f2d008ed1644-catalog-content\") pod \"376c5493-93c1-4bed-91f6-f2d008ed1644\" (UID: \"376c5493-93c1-4bed-91f6-f2d008ed1644\") " Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.185769 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/376c5493-93c1-4bed-91f6-f2d008ed1644-utilities" (OuterVolumeSpecName: "utilities") pod "376c5493-93c1-4bed-91f6-f2d008ed1644" (UID: "376c5493-93c1-4bed-91f6-f2d008ed1644"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.200528 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/376c5493-93c1-4bed-91f6-f2d008ed1644-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "376c5493-93c1-4bed-91f6-f2d008ed1644" (UID: "376c5493-93c1-4bed-91f6-f2d008ed1644"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.287638 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/376c5493-93c1-4bed-91f6-f2d008ed1644-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.287671 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/376c5493-93c1-4bed-91f6-f2d008ed1644-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.288272 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/376c5493-93c1-4bed-91f6-f2d008ed1644-kube-api-access-vnzcp" (OuterVolumeSpecName: "kube-api-access-vnzcp") pod "376c5493-93c1-4bed-91f6-f2d008ed1644" (UID: "376c5493-93c1-4bed-91f6-f2d008ed1644"). InnerVolumeSpecName "kube-api-access-vnzcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.389798 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnzcp\" (UniqueName: \"kubernetes.io/projected/376c5493-93c1-4bed-91f6-f2d008ed1644-kube-api-access-vnzcp\") on node \"crc\" DevicePath \"\"" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.567697 4636 generic.go:334] "Generic (PLEG): container finished" podID="376c5493-93c1-4bed-91f6-f2d008ed1644" containerID="18004e9676f89dd4a3917b6c1ad1b1051bfc9d5d5760ff98cc916d7f87039bcb" exitCode=0 Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.567732 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqglg" event={"ID":"376c5493-93c1-4bed-91f6-f2d008ed1644","Type":"ContainerDied","Data":"18004e9676f89dd4a3917b6c1ad1b1051bfc9d5d5760ff98cc916d7f87039bcb"} Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.567773 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqglg" event={"ID":"376c5493-93c1-4bed-91f6-f2d008ed1644","Type":"ContainerDied","Data":"e7f434bdf242d5c176b94290d47f2668e9d34270349b44d50e403b751cf49131"} Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.567790 4636 scope.go:117] "RemoveContainer" containerID="18004e9676f89dd4a3917b6c1ad1b1051bfc9d5d5760ff98cc916d7f87039bcb" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.567827 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qqglg" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.592619 4636 scope.go:117] "RemoveContainer" containerID="f472a45de183961c481492e5accf7b64b3e2c5f0861ffc9764f5efddfdcc98ca" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.629794 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qqglg"] Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.630491 4636 scope.go:117] "RemoveContainer" containerID="3e788ea56fb6a2889b0136e0e1b9757f993526fec33bc98438f8ead39624f290" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.639320 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qqglg"] Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.670249 4636 scope.go:117] "RemoveContainer" containerID="18004e9676f89dd4a3917b6c1ad1b1051bfc9d5d5760ff98cc916d7f87039bcb" Oct 03 15:15:33 crc kubenswrapper[4636]: E1003 15:15:33.673154 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18004e9676f89dd4a3917b6c1ad1b1051bfc9d5d5760ff98cc916d7f87039bcb\": container with ID starting with 18004e9676f89dd4a3917b6c1ad1b1051bfc9d5d5760ff98cc916d7f87039bcb not found: ID does not exist" containerID="18004e9676f89dd4a3917b6c1ad1b1051bfc9d5d5760ff98cc916d7f87039bcb" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.673204 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18004e9676f89dd4a3917b6c1ad1b1051bfc9d5d5760ff98cc916d7f87039bcb"} err="failed to get container status \"18004e9676f89dd4a3917b6c1ad1b1051bfc9d5d5760ff98cc916d7f87039bcb\": rpc error: code = NotFound desc = could not find container \"18004e9676f89dd4a3917b6c1ad1b1051bfc9d5d5760ff98cc916d7f87039bcb\": container with ID starting with 18004e9676f89dd4a3917b6c1ad1b1051bfc9d5d5760ff98cc916d7f87039bcb not found: ID does not exist" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.673239 4636 scope.go:117] "RemoveContainer" containerID="f472a45de183961c481492e5accf7b64b3e2c5f0861ffc9764f5efddfdcc98ca" Oct 03 15:15:33 crc kubenswrapper[4636]: E1003 15:15:33.673689 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f472a45de183961c481492e5accf7b64b3e2c5f0861ffc9764f5efddfdcc98ca\": container with ID starting with f472a45de183961c481492e5accf7b64b3e2c5f0861ffc9764f5efddfdcc98ca not found: ID does not exist" containerID="f472a45de183961c481492e5accf7b64b3e2c5f0861ffc9764f5efddfdcc98ca" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.673721 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f472a45de183961c481492e5accf7b64b3e2c5f0861ffc9764f5efddfdcc98ca"} err="failed to get container status \"f472a45de183961c481492e5accf7b64b3e2c5f0861ffc9764f5efddfdcc98ca\": rpc error: code = NotFound desc = could not find container \"f472a45de183961c481492e5accf7b64b3e2c5f0861ffc9764f5efddfdcc98ca\": container with ID starting with f472a45de183961c481492e5accf7b64b3e2c5f0861ffc9764f5efddfdcc98ca not found: ID does not exist" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.673740 4636 scope.go:117] "RemoveContainer" containerID="3e788ea56fb6a2889b0136e0e1b9757f993526fec33bc98438f8ead39624f290" Oct 03 15:15:33 crc kubenswrapper[4636]: E1003 15:15:33.674266 4636 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3e788ea56fb6a2889b0136e0e1b9757f993526fec33bc98438f8ead39624f290\": container with ID starting with 3e788ea56fb6a2889b0136e0e1b9757f993526fec33bc98438f8ead39624f290 not found: ID does not exist" containerID="3e788ea56fb6a2889b0136e0e1b9757f993526fec33bc98438f8ead39624f290" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.674287 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e788ea56fb6a2889b0136e0e1b9757f993526fec33bc98438f8ead39624f290"} err="failed to get container status \"3e788ea56fb6a2889b0136e0e1b9757f993526fec33bc98438f8ead39624f290\": rpc error: code = NotFound desc = could not find container \"3e788ea56fb6a2889b0136e0e1b9757f993526fec33bc98438f8ead39624f290\": container with ID starting with 3e788ea56fb6a2889b0136e0e1b9757f993526fec33bc98438f8ead39624f290 not found: ID does not exist" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.906843 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 03 15:15:33 crc kubenswrapper[4636]: E1003 15:15:33.907312 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="376c5493-93c1-4bed-91f6-f2d008ed1644" containerName="extract-content" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.907336 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="376c5493-93c1-4bed-91f6-f2d008ed1644" containerName="extract-content" Oct 03 15:15:33 crc kubenswrapper[4636]: E1003 15:15:33.907370 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="376c5493-93c1-4bed-91f6-f2d008ed1644" containerName="registry-server" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.907379 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="376c5493-93c1-4bed-91f6-f2d008ed1644" containerName="registry-server" Oct 03 15:15:33 crc kubenswrapper[4636]: E1003 15:15:33.907398 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d391b3-cee3-4591-814b-a1b99bed1872" containerName="tempest-tests-tempest-tests-runner" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.907409 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d391b3-cee3-4591-814b-a1b99bed1872" containerName="tempest-tests-tempest-tests-runner" Oct 03 15:15:33 crc kubenswrapper[4636]: E1003 15:15:33.907424 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="376c5493-93c1-4bed-91f6-f2d008ed1644" containerName="extract-utilities" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.907431 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="376c5493-93c1-4bed-91f6-f2d008ed1644" containerName="extract-utilities" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.907663 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="76d391b3-cee3-4591-814b-a1b99bed1872" containerName="tempest-tests-tempest-tests-runner" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.907697 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="376c5493-93c1-4bed-91f6-f2d008ed1644" containerName="registry-server" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.908477 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.910466 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-trj5l" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.916764 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.999103 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d141c423-495c-4fa0-af39-06bd5c484253\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 15:15:33 crc kubenswrapper[4636]: I1003 15:15:33.999488 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn2zz\" (UniqueName: \"kubernetes.io/projected/d141c423-495c-4fa0-af39-06bd5c484253-kube-api-access-xn2zz\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d141c423-495c-4fa0-af39-06bd5c484253\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 15:15:34 crc kubenswrapper[4636]: I1003 15:15:34.100953 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d141c423-495c-4fa0-af39-06bd5c484253\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 15:15:34 crc kubenswrapper[4636]: I1003 15:15:34.101063 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn2zz\" (UniqueName: \"kubernetes.io/projected/d141c423-495c-4fa0-af39-06bd5c484253-kube-api-access-xn2zz\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d141c423-495c-4fa0-af39-06bd5c484253\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 15:15:34 crc kubenswrapper[4636]: I1003 15:15:34.103411 4636 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d141c423-495c-4fa0-af39-06bd5c484253\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 15:15:34 crc kubenswrapper[4636]: I1003 15:15:34.290434 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn2zz\" (UniqueName: \"kubernetes.io/projected/d141c423-495c-4fa0-af39-06bd5c484253-kube-api-access-xn2zz\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d141c423-495c-4fa0-af39-06bd5c484253\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 15:15:34 crc kubenswrapper[4636]: I1003 15:15:34.315210 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d141c423-495c-4fa0-af39-06bd5c484253\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 15:15:34 crc 
kubenswrapper[4636]: I1003 15:15:34.522688 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 15:15:34 crc kubenswrapper[4636]: I1003 15:15:34.803545 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="376c5493-93c1-4bed-91f6-f2d008ed1644" path="/var/lib/kubelet/pods/376c5493-93c1-4bed-91f6-f2d008ed1644/volumes" Oct 03 15:15:35 crc kubenswrapper[4636]: I1003 15:15:35.114701 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 03 15:15:35 crc kubenswrapper[4636]: I1003 15:15:35.595070 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"d141c423-495c-4fa0-af39-06bd5c484253","Type":"ContainerStarted","Data":"c035db3b8890af494b06fc070ad6ed8301e821973d14a311473351d33c5a1d50"} Oct 03 15:15:36 crc kubenswrapper[4636]: I1003 15:15:36.627416 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"d141c423-495c-4fa0-af39-06bd5c484253","Type":"ContainerStarted","Data":"c335131a85956441ea647427b6bd8fdcd6da151a888d2d4199b8b72385a9ab20"} Oct 03 15:15:36 crc kubenswrapper[4636]: I1003 15:15:36.653097 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.393335303 podStartE2EDuration="3.653074534s" podCreationTimestamp="2025-10-03 15:15:33 +0000 UTC" firstStartedPulling="2025-10-03 15:15:35.109351431 +0000 UTC m=+4484.968077678" lastFinishedPulling="2025-10-03 15:15:36.369090662 +0000 UTC m=+4486.227816909" observedRunningTime="2025-10-03 15:15:36.642503695 +0000 UTC m=+4486.501229942" watchObservedRunningTime="2025-10-03 15:15:36.653074534 +0000 UTC m=+4486.511800781" Oct 03 15:15:37 crc kubenswrapper[4636]: I1003 15:15:37.794031 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc" Oct 03 15:15:37 crc kubenswrapper[4636]: E1003 15:15:37.794307 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:15:51 crc kubenswrapper[4636]: I1003 15:15:51.793882 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc" Oct 03 15:15:51 crc kubenswrapper[4636]: E1003 15:15:51.794611 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:15:55 crc kubenswrapper[4636]: I1003 15:15:55.310051 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r2hg7/must-gather-wd58h"] Oct 03 15:15:55 crc kubenswrapper[4636]: I1003 15:15:55.312061 4636 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r2hg7/must-gather-wd58h" Oct 03 15:15:55 crc kubenswrapper[4636]: I1003 15:15:55.316824 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-r2hg7"/"default-dockercfg-gktkl" Oct 03 15:15:55 crc kubenswrapper[4636]: I1003 15:15:55.317015 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-r2hg7"/"openshift-service-ca.crt" Oct 03 15:15:55 crc kubenswrapper[4636]: I1003 15:15:55.317163 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-r2hg7"/"kube-root-ca.crt" Oct 03 15:15:55 crc kubenswrapper[4636]: I1003 15:15:55.334063 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r2hg7/must-gather-wd58h"] Oct 03 15:15:55 crc kubenswrapper[4636]: I1003 15:15:55.488370 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfp45\" (UniqueName: \"kubernetes.io/projected/df99b419-f085-4d93-9a9b-c68e56153aeb-kube-api-access-lfp45\") pod \"must-gather-wd58h\" (UID: \"df99b419-f085-4d93-9a9b-c68e56153aeb\") " pod="openshift-must-gather-r2hg7/must-gather-wd58h" Oct 03 15:15:55 crc kubenswrapper[4636]: I1003 15:15:55.488599 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/df99b419-f085-4d93-9a9b-c68e56153aeb-must-gather-output\") pod \"must-gather-wd58h\" (UID: \"df99b419-f085-4d93-9a9b-c68e56153aeb\") " pod="openshift-must-gather-r2hg7/must-gather-wd58h" Oct 03 15:15:55 crc kubenswrapper[4636]: I1003 15:15:55.590914 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/df99b419-f085-4d93-9a9b-c68e56153aeb-must-gather-output\") pod \"must-gather-wd58h\" (UID: \"df99b419-f085-4d93-9a9b-c68e56153aeb\") " pod="openshift-must-gather-r2hg7/must-gather-wd58h" Oct 03 15:15:55 crc kubenswrapper[4636]: I1003 15:15:55.591168 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfp45\" (UniqueName: \"kubernetes.io/projected/df99b419-f085-4d93-9a9b-c68e56153aeb-kube-api-access-lfp45\") pod \"must-gather-wd58h\" (UID: \"df99b419-f085-4d93-9a9b-c68e56153aeb\") " pod="openshift-must-gather-r2hg7/must-gather-wd58h" Oct 03 15:15:55 crc kubenswrapper[4636]: I1003 15:15:55.591787 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/df99b419-f085-4d93-9a9b-c68e56153aeb-must-gather-output\") pod \"must-gather-wd58h\" (UID: \"df99b419-f085-4d93-9a9b-c68e56153aeb\") " pod="openshift-must-gather-r2hg7/must-gather-wd58h" Oct 03 15:15:55 crc kubenswrapper[4636]: I1003 15:15:55.614753 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfp45\" (UniqueName: \"kubernetes.io/projected/df99b419-f085-4d93-9a9b-c68e56153aeb-kube-api-access-lfp45\") pod \"must-gather-wd58h\" (UID: \"df99b419-f085-4d93-9a9b-c68e56153aeb\") " pod="openshift-must-gather-r2hg7/must-gather-wd58h" Oct 03 15:15:55 crc kubenswrapper[4636]: I1003 15:15:55.639059 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r2hg7/must-gather-wd58h" Oct 03 15:15:56 crc kubenswrapper[4636]: I1003 15:15:56.092530 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r2hg7/must-gather-wd58h"] Oct 03 15:15:56 crc kubenswrapper[4636]: W1003 15:15:56.096512 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf99b419_f085_4d93_9a9b_c68e56153aeb.slice/crio-8ea5920d49bb8dd0761280403248f179c6e7cc222de6a348885a810d912cd48e WatchSource:0}: Error finding container 8ea5920d49bb8dd0761280403248f179c6e7cc222de6a348885a810d912cd48e: Status 404 returned error can't find the container with id 8ea5920d49bb8dd0761280403248f179c6e7cc222de6a348885a810d912cd48e Oct 03 15:15:56 crc kubenswrapper[4636]: I1003 15:15:56.802370 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r2hg7/must-gather-wd58h" event={"ID":"df99b419-f085-4d93-9a9b-c68e56153aeb","Type":"ContainerStarted","Data":"8ea5920d49bb8dd0761280403248f179c6e7cc222de6a348885a810d912cd48e"} Oct 03 15:16:00 crc kubenswrapper[4636]: I1003 15:16:00.835497 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r2hg7/must-gather-wd58h" event={"ID":"df99b419-f085-4d93-9a9b-c68e56153aeb","Type":"ContainerStarted","Data":"5c1727e3cadf5d87e4029a8b2a00b0a4b180bed513c73a55840772a7137dfeb7"} Oct 03 15:16:01 crc kubenswrapper[4636]: I1003 15:16:01.844208 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r2hg7/must-gather-wd58h" event={"ID":"df99b419-f085-4d93-9a9b-c68e56153aeb","Type":"ContainerStarted","Data":"163e2f5889616e6ae9ca30297e0266fa0c20a71bdf7b7fec91cf28eac1f88990"} Oct 03 15:16:01 crc kubenswrapper[4636]: I1003 15:16:01.858861 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r2hg7/must-gather-wd58h" podStartSLOduration=2.586523584 podStartE2EDuration="6.858844486s" podCreationTimestamp="2025-10-03 15:15:55 +0000 UTC" firstStartedPulling="2025-10-03 15:15:56.098336115 +0000 UTC m=+4505.957062362" lastFinishedPulling="2025-10-03 15:16:00.370657017 +0000 UTC m=+4510.229383264" observedRunningTime="2025-10-03 15:16:01.85633186 +0000 UTC m=+4511.715058107" watchObservedRunningTime="2025-10-03 15:16:01.858844486 +0000 UTC m=+4511.717570733" Oct 03 15:16:05 crc kubenswrapper[4636]: I1003 15:16:05.514730 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r2hg7/crc-debug-72bnq"] Oct 03 15:16:05 crc kubenswrapper[4636]: I1003 15:16:05.516378 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r2hg7/crc-debug-72bnq" Oct 03 15:16:05 crc kubenswrapper[4636]: I1003 15:16:05.579320 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/450fc841-5bdf-47f3-8c20-6b70396e445f-host\") pod \"crc-debug-72bnq\" (UID: \"450fc841-5bdf-47f3-8c20-6b70396e445f\") " pod="openshift-must-gather-r2hg7/crc-debug-72bnq" Oct 03 15:16:05 crc kubenswrapper[4636]: I1003 15:16:05.579676 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88d6x\" (UniqueName: \"kubernetes.io/projected/450fc841-5bdf-47f3-8c20-6b70396e445f-kube-api-access-88d6x\") pod \"crc-debug-72bnq\" (UID: \"450fc841-5bdf-47f3-8c20-6b70396e445f\") " pod="openshift-must-gather-r2hg7/crc-debug-72bnq" Oct 03 15:16:05 crc kubenswrapper[4636]: I1003 15:16:05.681212 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88d6x\" (UniqueName: \"kubernetes.io/projected/450fc841-5bdf-47f3-8c20-6b70396e445f-kube-api-access-88d6x\") pod \"crc-debug-72bnq\" (UID: \"450fc841-5bdf-47f3-8c20-6b70396e445f\") " pod="openshift-must-gather-r2hg7/crc-debug-72bnq" Oct 03 15:16:05 crc kubenswrapper[4636]: I1003 15:16:05.681537 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/450fc841-5bdf-47f3-8c20-6b70396e445f-host\") pod \"crc-debug-72bnq\" (UID: \"450fc841-5bdf-47f3-8c20-6b70396e445f\") " pod="openshift-must-gather-r2hg7/crc-debug-72bnq" Oct 03 15:16:05 crc kubenswrapper[4636]: I1003 15:16:05.681675 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/450fc841-5bdf-47f3-8c20-6b70396e445f-host\") pod \"crc-debug-72bnq\" (UID: \"450fc841-5bdf-47f3-8c20-6b70396e445f\") " pod="openshift-must-gather-r2hg7/crc-debug-72bnq" Oct 03 15:16:05 crc kubenswrapper[4636]: I1003 15:16:05.710227 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88d6x\" (UniqueName: \"kubernetes.io/projected/450fc841-5bdf-47f3-8c20-6b70396e445f-kube-api-access-88d6x\") pod \"crc-debug-72bnq\" (UID: \"450fc841-5bdf-47f3-8c20-6b70396e445f\") " pod="openshift-must-gather-r2hg7/crc-debug-72bnq" Oct 03 15:16:05 crc kubenswrapper[4636]: I1003 15:16:05.833302 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r2hg7/crc-debug-72bnq" Oct 03 15:16:05 crc kubenswrapper[4636]: W1003 15:16:05.865087 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod450fc841_5bdf_47f3_8c20_6b70396e445f.slice/crio-0ec3f0c460d61ef8424a489909e1e2b53267b7d972b94827ed02ef1841ad081c WatchSource:0}: Error finding container 0ec3f0c460d61ef8424a489909e1e2b53267b7d972b94827ed02ef1841ad081c: Status 404 returned error can't find the container with id 0ec3f0c460d61ef8424a489909e1e2b53267b7d972b94827ed02ef1841ad081c Oct 03 15:16:06 crc kubenswrapper[4636]: I1003 15:16:06.794928 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc" Oct 03 15:16:06 crc kubenswrapper[4636]: E1003 15:16:06.795413 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:16:06 crc kubenswrapper[4636]: I1003 15:16:06.883771 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r2hg7/crc-debug-72bnq" event={"ID":"450fc841-5bdf-47f3-8c20-6b70396e445f","Type":"ContainerStarted","Data":"0ec3f0c460d61ef8424a489909e1e2b53267b7d972b94827ed02ef1841ad081c"} Oct 03 15:16:18 crc kubenswrapper[4636]: I1003 15:16:18.794156 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc" Oct 03 15:16:18 crc kubenswrapper[4636]: E1003 15:16:18.795061 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:16:20 crc kubenswrapper[4636]: I1003 15:16:20.016655 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r2hg7/crc-debug-72bnq" event={"ID":"450fc841-5bdf-47f3-8c20-6b70396e445f","Type":"ContainerStarted","Data":"3e567e1edf4c7ea08dbdc8f68993bf0900be4dd3b9ce68e257ebf3f3bc99c27f"} Oct 03 15:16:20 crc kubenswrapper[4636]: I1003 15:16:20.035783 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r2hg7/crc-debug-72bnq" podStartSLOduration=2.013747171 podStartE2EDuration="15.035766042s" podCreationTimestamp="2025-10-03 15:16:05 +0000 UTC" firstStartedPulling="2025-10-03 15:16:05.880949435 +0000 UTC m=+4515.739675682" lastFinishedPulling="2025-10-03 15:16:18.902968306 +0000 UTC m=+4528.761694553" observedRunningTime="2025-10-03 15:16:20.031256514 +0000 UTC m=+4529.889982761" watchObservedRunningTime="2025-10-03 15:16:20.035766042 +0000 UTC m=+4529.894492289" Oct 03 15:16:33 crc kubenswrapper[4636]: I1003 15:16:33.793914 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc" Oct 03 15:16:33 crc kubenswrapper[4636]: E1003 15:16:33.794593 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:16:44 crc kubenswrapper[4636]: I1003 15:16:44.800762 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc" Oct 03 15:16:44 crc kubenswrapper[4636]: E1003 15:16:44.802360 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:16:56 crc kubenswrapper[4636]: I1003 15:16:56.795720 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc" Oct 03 15:16:56 crc kubenswrapper[4636]: E1003 15:16:56.796305 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:17:07 crc kubenswrapper[4636]: I1003 15:17:07.794384 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc" Oct 03 15:17:07 crc kubenswrapper[4636]: E1003 15:17:07.795154 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:17:19 crc kubenswrapper[4636]: I1003 15:17:19.796992 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc" Oct 03 15:17:19 crc kubenswrapper[4636]: E1003 15:17:19.797823 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:17:34 crc kubenswrapper[4636]: I1003 15:17:34.345639 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-f548d674d-2q8gg_ee5faec7-3829-49b6-aca7-452f5eae6a67/barbican-api-log/0.log" Oct 03 15:17:34 crc kubenswrapper[4636]: I1003 15:17:34.361197 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-f548d674d-2q8gg_ee5faec7-3829-49b6-aca7-452f5eae6a67/barbican-api/0.log" Oct 03 15:17:34 crc kubenswrapper[4636]: I1003 
15:17:34.572326 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f6878bdf6-2vf98_9cf408a6-c7e6-4bf3-80c6-47cc10bec465/barbican-keystone-listener/0.log" Oct 03 15:17:34 crc kubenswrapper[4636]: I1003 15:17:34.658138 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f6878bdf6-2vf98_9cf408a6-c7e6-4bf3-80c6-47cc10bec465/barbican-keystone-listener-log/0.log" Oct 03 15:17:34 crc kubenswrapper[4636]: I1003 15:17:34.795218 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc" Oct 03 15:17:34 crc kubenswrapper[4636]: E1003 15:17:34.795638 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:17:34 crc kubenswrapper[4636]: I1003 15:17:34.868939 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6f76699687-g9k2g_596e3078-e359-4e8d-a7c0-74c710f2c2f9/barbican-worker/0.log" Oct 03 15:17:34 crc kubenswrapper[4636]: I1003 15:17:34.870067 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6f76699687-g9k2g_596e3078-e359-4e8d-a7c0-74c710f2c2f9/barbican-worker-log/0.log" Oct 03 15:17:35 crc kubenswrapper[4636]: I1003 15:17:35.107643 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr_57d50548-733b-4696-9e0f-fc749406a055/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:17:35 crc kubenswrapper[4636]: I1003 15:17:35.318018 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e/ceilometer-central-agent/0.log" Oct 03 15:17:35 crc kubenswrapper[4636]: I1003 15:17:35.372781 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e/ceilometer-notification-agent/0.log" Oct 03 15:17:35 crc kubenswrapper[4636]: I1003 15:17:35.393814 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e/proxy-httpd/0.log" Oct 03 15:17:35 crc kubenswrapper[4636]: I1003 15:17:35.647749 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e/sg-core/0.log" Oct 03 15:17:35 crc kubenswrapper[4636]: I1003 15:17:35.714807 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_7a3dbfb9-f2b2-4725-9960-07d3fb89125e/cinder-api/0.log" Oct 03 15:17:35 crc kubenswrapper[4636]: I1003 15:17:35.861558 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_7a3dbfb9-f2b2-4725-9960-07d3fb89125e/cinder-api-log/0.log" Oct 03 15:17:35 crc kubenswrapper[4636]: I1003 15:17:35.917344 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_94180bad-9d72-4d67-aefa-1fd7a9d886ac/cinder-scheduler/0.log" Oct 03 15:17:36 crc kubenswrapper[4636]: I1003 15:17:36.068375 4636 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_94180bad-9d72-4d67-aefa-1fd7a9d886ac/probe/0.log" Oct 03 15:17:36 crc kubenswrapper[4636]: I1003 15:17:36.187255 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-qplrd_9781ac24-d39e-4e00-b2e8-3eac5f120090/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:17:36 crc kubenswrapper[4636]: I1003 15:17:36.424516 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-kn84r_ca12d2cd-3187-4910-9e28-2f977be4bcf8/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:17:36 crc kubenswrapper[4636]: I1003 15:17:36.593229 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk_e872c241-3445-4382-a7f0-1a15d6a223c2/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:17:36 crc kubenswrapper[4636]: I1003 15:17:36.769169 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-d7677974f-dtkft_317017e9-687f-4a84-b896-fab84c269e2b/init/0.log" Oct 03 15:17:36 crc kubenswrapper[4636]: I1003 15:17:36.915444 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-d7677974f-dtkft_317017e9-687f-4a84-b896-fab84c269e2b/init/0.log" Oct 03 15:17:37 crc kubenswrapper[4636]: I1003 15:17:37.071677 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-d7677974f-dtkft_317017e9-687f-4a84-b896-fab84c269e2b/dnsmasq-dns/0.log" Oct 03 15:17:37 crc kubenswrapper[4636]: I1003 15:17:37.211393 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-f99bb_a1c24630-7d57-45b9-8bdd-fb45d6a74c61/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:17:37 crc kubenswrapper[4636]: I1003 15:17:37.284623 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_48268aa0-45d6-42d4-a902-6f9221eae8d7/glance-httpd/0.log" Oct 03 15:17:37 crc kubenswrapper[4636]: I1003 15:17:37.468586 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_48268aa0-45d6-42d4-a902-6f9221eae8d7/glance-log/0.log" Oct 03 15:17:37 crc kubenswrapper[4636]: I1003 15:17:37.567249 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4/glance-httpd/0.log" Oct 03 15:17:37 crc kubenswrapper[4636]: I1003 15:17:37.853484 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4/glance-log/0.log" Oct 03 15:17:37 crc kubenswrapper[4636]: I1003 15:17:37.864455 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-8c5bc9456-rfvns_0025da7c-17f3-4036-a9fc-3330508c11cd/horizon/1.log" Oct 03 15:17:38 crc kubenswrapper[4636]: I1003 15:17:38.133169 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-8c5bc9456-rfvns_0025da7c-17f3-4036-a9fc-3330508c11cd/horizon/0.log" Oct 03 15:17:38 crc kubenswrapper[4636]: I1003 15:17:38.465407 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-8c5bc9456-rfvns_0025da7c-17f3-4036-a9fc-3330508c11cd/horizon-log/0.log" Oct 03 15:17:38 crc kubenswrapper[4636]: I1003 15:17:38.496261 4636 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-rldfm_9634671b-cf60-4cdf-9558-417432ff5401/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:17:39 crc kubenswrapper[4636]: I1003 15:17:39.158332 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-r9sdw_52b89ba7-3476-42ae-aa47-fb7a38732669/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:17:39 crc kubenswrapper[4636]: I1003 15:17:39.390014 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29325061-4mrxv_e2f61a03-c4e7-414d-b6f9-b1f920d35757/keystone-cron/0.log" Oct 03 15:17:39 crc kubenswrapper[4636]: I1003 15:17:39.417561 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0/kube-state-metrics/0.log" Oct 03 15:17:39 crc kubenswrapper[4636]: I1003 15:17:39.417817 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7b4f64b6bf-z54p6_5ddc1097-69d8-4db3-93f1-a43038191aae/keystone-api/0.log" Oct 03 15:17:39 crc kubenswrapper[4636]: I1003 15:17:39.622901 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk_917285a7-3281-4326-8837-f1db2fe9a711/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:17:39 crc kubenswrapper[4636]: I1003 15:17:39.813576 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xrzrx"] Oct 03 15:17:39 crc kubenswrapper[4636]: I1003 15:17:39.820542 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xrzrx" Oct 03 15:17:39 crc kubenswrapper[4636]: I1003 15:17:39.835291 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37e2d0a1-58e7-487a-841a-53f45820362d-catalog-content\") pod \"certified-operators-xrzrx\" (UID: \"37e2d0a1-58e7-487a-841a-53f45820362d\") " pod="openshift-marketplace/certified-operators-xrzrx" Oct 03 15:17:39 crc kubenswrapper[4636]: I1003 15:17:39.835399 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9h4f\" (UniqueName: \"kubernetes.io/projected/37e2d0a1-58e7-487a-841a-53f45820362d-kube-api-access-w9h4f\") pod \"certified-operators-xrzrx\" (UID: \"37e2d0a1-58e7-487a-841a-53f45820362d\") " pod="openshift-marketplace/certified-operators-xrzrx" Oct 03 15:17:39 crc kubenswrapper[4636]: I1003 15:17:39.835564 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37e2d0a1-58e7-487a-841a-53f45820362d-utilities\") pod \"certified-operators-xrzrx\" (UID: \"37e2d0a1-58e7-487a-841a-53f45820362d\") " pod="openshift-marketplace/certified-operators-xrzrx" Oct 03 15:17:39 crc kubenswrapper[4636]: I1003 15:17:39.841242 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xrzrx"] Oct 03 15:17:39 crc kubenswrapper[4636]: I1003 15:17:39.936729 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37e2d0a1-58e7-487a-841a-53f45820362d-utilities\") pod \"certified-operators-xrzrx\" (UID: \"37e2d0a1-58e7-487a-841a-53f45820362d\") " 
pod="openshift-marketplace/certified-operators-xrzrx" Oct 03 15:17:39 crc kubenswrapper[4636]: I1003 15:17:39.936801 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37e2d0a1-58e7-487a-841a-53f45820362d-catalog-content\") pod \"certified-operators-xrzrx\" (UID: \"37e2d0a1-58e7-487a-841a-53f45820362d\") " pod="openshift-marketplace/certified-operators-xrzrx" Oct 03 15:17:39 crc kubenswrapper[4636]: I1003 15:17:39.936862 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9h4f\" (UniqueName: \"kubernetes.io/projected/37e2d0a1-58e7-487a-841a-53f45820362d-kube-api-access-w9h4f\") pod \"certified-operators-xrzrx\" (UID: \"37e2d0a1-58e7-487a-841a-53f45820362d\") " pod="openshift-marketplace/certified-operators-xrzrx" Oct 03 15:17:39 crc kubenswrapper[4636]: I1003 15:17:39.937565 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37e2d0a1-58e7-487a-841a-53f45820362d-utilities\") pod \"certified-operators-xrzrx\" (UID: \"37e2d0a1-58e7-487a-841a-53f45820362d\") " pod="openshift-marketplace/certified-operators-xrzrx" Oct 03 15:17:39 crc kubenswrapper[4636]: I1003 15:17:39.937785 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37e2d0a1-58e7-487a-841a-53f45820362d-catalog-content\") pod \"certified-operators-xrzrx\" (UID: \"37e2d0a1-58e7-487a-841a-53f45820362d\") " pod="openshift-marketplace/certified-operators-xrzrx" Oct 03 15:17:40 crc kubenswrapper[4636]: I1003 15:17:40.402950 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9h4f\" (UniqueName: \"kubernetes.io/projected/37e2d0a1-58e7-487a-841a-53f45820362d-kube-api-access-w9h4f\") pod \"certified-operators-xrzrx\" (UID: \"37e2d0a1-58e7-487a-841a-53f45820362d\") " pod="openshift-marketplace/certified-operators-xrzrx" Oct 03 15:17:40 crc kubenswrapper[4636]: I1003 15:17:40.451591 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xrzrx" Oct 03 15:17:40 crc kubenswrapper[4636]: I1003 15:17:40.462173 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-d7d56d58f-cswwm_58fac2cb-4974-4241-8a11-77ad13d22306/neutron-httpd/0.log" Oct 03 15:17:40 crc kubenswrapper[4636]: I1003 15:17:40.709901 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-d7d56d58f-cswwm_58fac2cb-4974-4241-8a11-77ad13d22306/neutron-api/0.log" Oct 03 15:17:41 crc kubenswrapper[4636]: I1003 15:17:41.066785 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2_4932588e-72ae-44a2-bc95-08cd792a140f/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:17:41 crc kubenswrapper[4636]: I1003 15:17:41.241214 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xrzrx"] Oct 03 15:17:41 crc kubenswrapper[4636]: I1003 15:17:41.767800 4636 generic.go:334] "Generic (PLEG): container finished" podID="37e2d0a1-58e7-487a-841a-53f45820362d" containerID="ec894dc15539324bb37792a1561adead07c0254abba7ab38387135cf83bef38f" exitCode=0 Oct 03 15:17:41 crc kubenswrapper[4636]: I1003 15:17:41.768076 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrzrx" event={"ID":"37e2d0a1-58e7-487a-841a-53f45820362d","Type":"ContainerDied","Data":"ec894dc15539324bb37792a1561adead07c0254abba7ab38387135cf83bef38f"} Oct 03 15:17:41 crc kubenswrapper[4636]: I1003 15:17:41.768136 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrzrx" event={"ID":"37e2d0a1-58e7-487a-841a-53f45820362d","Type":"ContainerStarted","Data":"fe333ebbc3692453fd833ac8008aa6e834f1b9b3d21331ba303c908f4850d85b"} Oct 03 15:17:42 crc kubenswrapper[4636]: I1003 15:17:42.079846 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_efbece4d-3b40-41b8-819a-9dac3cf42b21/nova-cell0-conductor-conductor/0.log" Oct 03 15:17:42 crc kubenswrapper[4636]: I1003 15:17:42.727932 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8c04651e-c4ab-4322-ae46-6ee8a115ed64/nova-api-log/0.log" Oct 03 15:17:42 crc kubenswrapper[4636]: I1003 15:17:42.731577 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d/nova-cell1-conductor-conductor/0.log" Oct 03 15:17:43 crc kubenswrapper[4636]: I1003 15:17:43.076762 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8c04651e-c4ab-4322-ae46-6ee8a115ed64/nova-api-api/0.log" Oct 03 15:17:43 crc kubenswrapper[4636]: I1003 15:17:43.140849 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_68498e63-11ab-4746-ae7f-01662c1e136f/nova-cell1-novncproxy-novncproxy/0.log" Oct 03 15:17:43 crc kubenswrapper[4636]: I1003 15:17:43.419486 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-ggwhp_ee4e092c-de87-4547-a39a-1a451ef9dc64/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:17:43 crc kubenswrapper[4636]: I1003 15:17:43.622951 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_4c9bc86e-3770-40e9-bf37-80627278032b/nova-metadata-log/0.log" Oct 03 15:17:43 crc kubenswrapper[4636]: I1003 15:17:43.788976 4636 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrzrx" event={"ID":"37e2d0a1-58e7-487a-841a-53f45820362d","Type":"ContainerStarted","Data":"488d1775a8fbf9464a16cffb40fbbd5d4351c989892e2bc2360607a9f7b4009d"} Oct 03 15:17:44 crc kubenswrapper[4636]: I1003 15:17:44.449912 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b3439f9c-0086-413d-a84f-79e7da2ffcbd/mysql-bootstrap/0.log" Oct 03 15:17:44 crc kubenswrapper[4636]: I1003 15:17:44.539149 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_493bf5be-a62b-4d5e-8de8-082ab7d23842/nova-scheduler-scheduler/0.log" Oct 03 15:17:44 crc kubenswrapper[4636]: I1003 15:17:44.651286 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b3439f9c-0086-413d-a84f-79e7da2ffcbd/mysql-bootstrap/0.log" Oct 03 15:17:44 crc kubenswrapper[4636]: I1003 15:17:44.806821 4636 generic.go:334] "Generic (PLEG): container finished" podID="37e2d0a1-58e7-487a-841a-53f45820362d" containerID="488d1775a8fbf9464a16cffb40fbbd5d4351c989892e2bc2360607a9f7b4009d" exitCode=0 Oct 03 15:17:44 crc kubenswrapper[4636]: I1003 15:17:44.808560 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrzrx" event={"ID":"37e2d0a1-58e7-487a-841a-53f45820362d","Type":"ContainerDied","Data":"488d1775a8fbf9464a16cffb40fbbd5d4351c989892e2bc2360607a9f7b4009d"} Oct 03 15:17:44 crc kubenswrapper[4636]: I1003 15:17:44.822690 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b3439f9c-0086-413d-a84f-79e7da2ffcbd/galera/0.log" Oct 03 15:17:44 crc kubenswrapper[4636]: E1003 15:17:44.841869 4636 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37e2d0a1_58e7_487a_841a_53f45820362d.slice/crio-488d1775a8fbf9464a16cffb40fbbd5d4351c989892e2bc2360607a9f7b4009d.scope\": RecentStats: unable to find data in memory cache]" Oct 03 15:17:45 crc kubenswrapper[4636]: I1003 15:17:45.158698 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_781432ad-b393-4271-8a8a-39254e422cd4/mysql-bootstrap/0.log" Oct 03 15:17:45 crc kubenswrapper[4636]: I1003 15:17:45.369691 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_781432ad-b393-4271-8a8a-39254e422cd4/mysql-bootstrap/0.log" Oct 03 15:17:45 crc kubenswrapper[4636]: I1003 15:17:45.548344 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_781432ad-b393-4271-8a8a-39254e422cd4/galera/0.log" Oct 03 15:17:45 crc kubenswrapper[4636]: I1003 15:17:45.773631 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_0a7aa438-f4f0-4975-a0e8-1005b56f8957/openstackclient/0.log" Oct 03 15:17:45 crc kubenswrapper[4636]: I1003 15:17:45.783404 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_4c9bc86e-3770-40e9-bf37-80627278032b/nova-metadata-metadata/0.log" Oct 03 15:17:46 crc kubenswrapper[4636]: I1003 15:17:46.002923 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-2mfj2_62646db9-d39c-4cb1-b308-22dff51e4bcf/ovn-controller/0.log" Oct 03 15:17:46 crc kubenswrapper[4636]: I1003 15:17:46.176182 4636 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-nhttt"] Oct 03 15:17:46 crc kubenswrapper[4636]: I1003 15:17:46.178694 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nhttt" Oct 03 15:17:46 crc kubenswrapper[4636]: I1003 15:17:46.205636 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nhttt"] Oct 03 15:17:46 crc kubenswrapper[4636]: I1003 15:17:46.284108 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2ba6ce8-af8f-4027-9d19-06af3a06c4c8-catalog-content\") pod \"redhat-operators-nhttt\" (UID: \"e2ba6ce8-af8f-4027-9d19-06af3a06c4c8\") " pod="openshift-marketplace/redhat-operators-nhttt" Oct 03 15:17:46 crc kubenswrapper[4636]: I1003 15:17:46.284207 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2ba6ce8-af8f-4027-9d19-06af3a06c4c8-utilities\") pod \"redhat-operators-nhttt\" (UID: \"e2ba6ce8-af8f-4027-9d19-06af3a06c4c8\") " pod="openshift-marketplace/redhat-operators-nhttt" Oct 03 15:17:46 crc kubenswrapper[4636]: I1003 15:17:46.284266 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjqhv\" (UniqueName: \"kubernetes.io/projected/e2ba6ce8-af8f-4027-9d19-06af3a06c4c8-kube-api-access-hjqhv\") pod \"redhat-operators-nhttt\" (UID: \"e2ba6ce8-af8f-4027-9d19-06af3a06c4c8\") " pod="openshift-marketplace/redhat-operators-nhttt" Oct 03 15:17:46 crc kubenswrapper[4636]: I1003 15:17:46.337393 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-4f5ff_291d0189-08a0-4b8b-8406-8601de0e3708/openstack-network-exporter/0.log" Oct 03 15:17:46 crc kubenswrapper[4636]: I1003 15:17:46.385567 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2ba6ce8-af8f-4027-9d19-06af3a06c4c8-catalog-content\") pod \"redhat-operators-nhttt\" (UID: \"e2ba6ce8-af8f-4027-9d19-06af3a06c4c8\") " pod="openshift-marketplace/redhat-operators-nhttt" Oct 03 15:17:46 crc kubenswrapper[4636]: I1003 15:17:46.385662 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2ba6ce8-af8f-4027-9d19-06af3a06c4c8-utilities\") pod \"redhat-operators-nhttt\" (UID: \"e2ba6ce8-af8f-4027-9d19-06af3a06c4c8\") " pod="openshift-marketplace/redhat-operators-nhttt" Oct 03 15:17:46 crc kubenswrapper[4636]: I1003 15:17:46.385708 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjqhv\" (UniqueName: \"kubernetes.io/projected/e2ba6ce8-af8f-4027-9d19-06af3a06c4c8-kube-api-access-hjqhv\") pod \"redhat-operators-nhttt\" (UID: \"e2ba6ce8-af8f-4027-9d19-06af3a06c4c8\") " pod="openshift-marketplace/redhat-operators-nhttt" Oct 03 15:17:46 crc kubenswrapper[4636]: I1003 15:17:46.386037 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2ba6ce8-af8f-4027-9d19-06af3a06c4c8-catalog-content\") pod \"redhat-operators-nhttt\" (UID: \"e2ba6ce8-af8f-4027-9d19-06af3a06c4c8\") " pod="openshift-marketplace/redhat-operators-nhttt" Oct 03 15:17:46 crc kubenswrapper[4636]: I1003 15:17:46.386387 4636 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2ba6ce8-af8f-4027-9d19-06af3a06c4c8-utilities\") pod \"redhat-operators-nhttt\" (UID: \"e2ba6ce8-af8f-4027-9d19-06af3a06c4c8\") " pod="openshift-marketplace/redhat-operators-nhttt" Oct 03 15:17:46 crc kubenswrapper[4636]: I1003 15:17:46.433131 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjqhv\" (UniqueName: \"kubernetes.io/projected/e2ba6ce8-af8f-4027-9d19-06af3a06c4c8-kube-api-access-hjqhv\") pod \"redhat-operators-nhttt\" (UID: \"e2ba6ce8-af8f-4027-9d19-06af3a06c4c8\") " pod="openshift-marketplace/redhat-operators-nhttt" Oct 03 15:17:46 crc kubenswrapper[4636]: I1003 15:17:46.462312 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2pfz4_fc054158-e506-4945-b3da-50265dc1b1aa/ovsdb-server-init/0.log" Oct 03 15:17:46 crc kubenswrapper[4636]: I1003 15:17:46.500764 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nhttt" Oct 03 15:17:46 crc kubenswrapper[4636]: I1003 15:17:46.929789 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrzrx" event={"ID":"37e2d0a1-58e7-487a-841a-53f45820362d","Type":"ContainerStarted","Data":"02fd170cdd376e11e486ea621244469cd39cd672afdf4bf6cd259683a2a4d7ac"} Oct 03 15:17:46 crc kubenswrapper[4636]: I1003 15:17:46.958498 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xrzrx" podStartSLOduration=4.019061912 podStartE2EDuration="7.958482773s" podCreationTimestamp="2025-10-03 15:17:39 +0000 UTC" firstStartedPulling="2025-10-03 15:17:41.76987928 +0000 UTC m=+4611.628605527" lastFinishedPulling="2025-10-03 15:17:45.709300141 +0000 UTC m=+4615.568026388" observedRunningTime="2025-10-03 15:17:46.956258304 +0000 UTC m=+4616.814984551" watchObservedRunningTime="2025-10-03 15:17:46.958482773 +0000 UTC m=+4616.817209020" Oct 03 15:17:46 crc kubenswrapper[4636]: I1003 15:17:46.979228 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2pfz4_fc054158-e506-4945-b3da-50265dc1b1aa/ovs-vswitchd/0.log" Oct 03 15:17:47 crc kubenswrapper[4636]: I1003 15:17:47.010644 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nhttt"] Oct 03 15:17:47 crc kubenswrapper[4636]: I1003 15:17:47.072202 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2pfz4_fc054158-e506-4945-b3da-50265dc1b1aa/ovsdb-server/0.log" Oct 03 15:17:47 crc kubenswrapper[4636]: I1003 15:17:47.148057 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2pfz4_fc054158-e506-4945-b3da-50265dc1b1aa/ovsdb-server-init/0.log" Oct 03 15:17:47 crc kubenswrapper[4636]: I1003 15:17:47.794132 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc" Oct 03 15:17:47 crc kubenswrapper[4636]: I1003 15:17:47.969681 4636 generic.go:334] "Generic (PLEG): container finished" podID="e2ba6ce8-af8f-4027-9d19-06af3a06c4c8" containerID="883bd44b74917dc8793969c61f735a6fb4e307d4d28579ef76c66cedc99978be" exitCode=0 Oct 03 15:17:47 crc kubenswrapper[4636]: I1003 15:17:47.970767 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhttt" 
event={"ID":"e2ba6ce8-af8f-4027-9d19-06af3a06c4c8","Type":"ContainerDied","Data":"883bd44b74917dc8793969c61f735a6fb4e307d4d28579ef76c66cedc99978be"} Oct 03 15:17:47 crc kubenswrapper[4636]: I1003 15:17:47.970794 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhttt" event={"ID":"e2ba6ce8-af8f-4027-9d19-06af3a06c4c8","Type":"ContainerStarted","Data":"db77ac002709ad299527fde1a7b0bd5b88a3a7d3ac329b3f78ee4df472051690"} Oct 03 15:17:47 crc kubenswrapper[4636]: I1003 15:17:47.973461 4636 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 15:17:48 crc kubenswrapper[4636]: I1003 15:17:48.111070 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-frzc6_d1e8fa7f-c140-4196-8967-ca303b35e8c5/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:17:48 crc kubenswrapper[4636]: I1003 15:17:48.444404 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9017beb0-a89a-4efa-b304-ee0ab7a8ce54/openstack-network-exporter/0.log" Oct 03 15:17:48 crc kubenswrapper[4636]: I1003 15:17:48.474945 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9017beb0-a89a-4efa-b304-ee0ab7a8ce54/ovn-northd/0.log" Oct 03 15:17:48 crc kubenswrapper[4636]: I1003 15:17:48.982072 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerStarted","Data":"913c176d23c8c8c6a552be6bf2fd2627170ae4a3ef1ef4abe575bb231ee8bb69"} Oct 03 15:17:49 crc kubenswrapper[4636]: I1003 15:17:49.103953 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_af99ddda-1ae6-4b70-9422-06c99e8664e5/openstack-network-exporter/0.log" Oct 03 15:17:49 crc kubenswrapper[4636]: I1003 15:17:49.460290 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_af99ddda-1ae6-4b70-9422-06c99e8664e5/ovsdbserver-nb/0.log" Oct 03 15:17:49 crc kubenswrapper[4636]: I1003 15:17:49.711066 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2a4510e7-aa39-4e1f-80bb-196127d2643c/openstack-network-exporter/0.log" Oct 03 15:17:49 crc kubenswrapper[4636]: I1003 15:17:49.809714 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2a4510e7-aa39-4e1f-80bb-196127d2643c/ovsdbserver-sb/0.log" Oct 03 15:17:50 crc kubenswrapper[4636]: I1003 15:17:50.025775 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhttt" event={"ID":"e2ba6ce8-af8f-4027-9d19-06af3a06c4c8","Type":"ContainerStarted","Data":"8f1515b591823f5e8c48651dc38076a20cf3c69713c47a08cf9e8c891aaa70f7"} Oct 03 15:17:50 crc kubenswrapper[4636]: I1003 15:17:50.417466 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6796cf444-9xs6c_7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2/placement-api/0.log" Oct 03 15:17:50 crc kubenswrapper[4636]: I1003 15:17:50.451671 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xrzrx" Oct 03 15:17:50 crc kubenswrapper[4636]: I1003 15:17:50.451719 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xrzrx" Oct 03 15:17:50 crc kubenswrapper[4636]: I1003 15:17:50.488032 4636 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e97eeb5a-f169-4c58-bda2-c727ca1f5126/setup-container/0.log" Oct 03 15:17:50 crc kubenswrapper[4636]: I1003 15:17:50.585212 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6796cf444-9xs6c_7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2/placement-log/0.log" Oct 03 15:17:50 crc kubenswrapper[4636]: I1003 15:17:50.764442 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e97eeb5a-f169-4c58-bda2-c727ca1f5126/setup-container/0.log" Oct 03 15:17:50 crc kubenswrapper[4636]: I1003 15:17:50.875918 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e97eeb5a-f169-4c58-bda2-c727ca1f5126/rabbitmq/0.log" Oct 03 15:17:51 crc kubenswrapper[4636]: I1003 15:17:51.010383 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f7c3cb64-6553-4d95-8ccc-25f758b3cc97/setup-container/0.log" Oct 03 15:17:51 crc kubenswrapper[4636]: I1003 15:17:51.277642 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f7c3cb64-6553-4d95-8ccc-25f758b3cc97/setup-container/0.log" Oct 03 15:17:51 crc kubenswrapper[4636]: I1003 15:17:51.438631 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f7c3cb64-6553-4d95-8ccc-25f758b3cc97/rabbitmq/0.log" Oct 03 15:17:51 crc kubenswrapper[4636]: I1003 15:17:51.528392 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-xrzrx" podUID="37e2d0a1-58e7-487a-841a-53f45820362d" containerName="registry-server" probeResult="failure" output=< Oct 03 15:17:51 crc kubenswrapper[4636]: timeout: failed to connect service ":50051" within 1s Oct 03 15:17:51 crc kubenswrapper[4636]: > Oct 03 15:17:51 crc kubenswrapper[4636]: I1003 15:17:51.578874 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9_0057d92e-1564-4b8e-93e9-aee9f862501e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:17:51 crc kubenswrapper[4636]: I1003 15:17:51.796928 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-4vk2h_e7ae7cb3-1588-4c70-92e2-942cef9d9b0a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:17:52 crc kubenswrapper[4636]: I1003 15:17:52.038688 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw_9eb85b02-3bf8-4fe8-a060-c3593e995499/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:17:52 crc kubenswrapper[4636]: I1003 15:17:52.504530 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-tlxw2_1af273b7-459c-4175-9085-28fa11fb76ee/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:17:52 crc kubenswrapper[4636]: I1003 15:17:52.581488 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-f24zs_4fa2c95f-4798-46d0-8e21-31334d585714/ssh-known-hosts-edpm-deployment/0.log" Oct 03 15:17:52 crc kubenswrapper[4636]: I1003 15:17:52.949425 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6766dbb747-7j5j7_e0a3acac-6d5f-49d7-9b2e-52bd155fb674/proxy-server/0.log" Oct 03 15:17:53 crc kubenswrapper[4636]: I1003 15:17:53.013937 4636 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-proxy-6766dbb747-7j5j7_e0a3acac-6d5f-49d7-9b2e-52bd155fb674/proxy-httpd/0.log" Oct 03 15:17:53 crc kubenswrapper[4636]: I1003 15:17:53.369203 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-qz8ds_00eeeec0-4e4a-4e2c-aaa6-07a793372fd7/swift-ring-rebalance/0.log" Oct 03 15:17:53 crc kubenswrapper[4636]: I1003 15:17:53.563831 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/account-auditor/0.log" Oct 03 15:17:53 crc kubenswrapper[4636]: I1003 15:17:53.602810 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/account-reaper/0.log" Oct 03 15:17:53 crc kubenswrapper[4636]: I1003 15:17:53.782941 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/account-replicator/0.log" Oct 03 15:17:53 crc kubenswrapper[4636]: I1003 15:17:53.922951 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/account-server/0.log" Oct 03 15:17:53 crc kubenswrapper[4636]: I1003 15:17:53.931611 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/container-auditor/0.log" Oct 03 15:17:54 crc kubenswrapper[4636]: I1003 15:17:54.086943 4636 generic.go:334] "Generic (PLEG): container finished" podID="e2ba6ce8-af8f-4027-9d19-06af3a06c4c8" containerID="8f1515b591823f5e8c48651dc38076a20cf3c69713c47a08cf9e8c891aaa70f7" exitCode=0 Oct 03 15:17:54 crc kubenswrapper[4636]: I1003 15:17:54.086988 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhttt" event={"ID":"e2ba6ce8-af8f-4027-9d19-06af3a06c4c8","Type":"ContainerDied","Data":"8f1515b591823f5e8c48651dc38076a20cf3c69713c47a08cf9e8c891aaa70f7"} Oct 03 15:17:54 crc kubenswrapper[4636]: I1003 15:17:54.125965 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/container-replicator/0.log" Oct 03 15:17:54 crc kubenswrapper[4636]: I1003 15:17:54.205853 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/container-server/0.log" Oct 03 15:17:54 crc kubenswrapper[4636]: I1003 15:17:54.265322 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/container-updater/0.log" Oct 03 15:17:54 crc kubenswrapper[4636]: I1003 15:17:54.441223 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/object-auditor/0.log" Oct 03 15:17:54 crc kubenswrapper[4636]: I1003 15:17:54.494405 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/object-expirer/0.log" Oct 03 15:17:54 crc kubenswrapper[4636]: I1003 15:17:54.527397 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/object-replicator/0.log" Oct 03 15:17:54 crc kubenswrapper[4636]: I1003 15:17:54.849049 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/object-server/0.log" Oct 03 15:17:54 crc kubenswrapper[4636]: I1003 15:17:54.864560 4636 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/rsync/0.log" Oct 03 15:17:54 crc kubenswrapper[4636]: I1003 15:17:54.881877 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/object-updater/0.log" Oct 03 15:17:55 crc kubenswrapper[4636]: I1003 15:17:55.103893 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhttt" event={"ID":"e2ba6ce8-af8f-4027-9d19-06af3a06c4c8","Type":"ContainerStarted","Data":"babcd8eee3c4e6771c801819be5980b59ce88e8076c1992810e6a4063a190195"} Oct 03 15:17:55 crc kubenswrapper[4636]: I1003 15:17:55.134240 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nhttt" podStartSLOduration=2.593671755 podStartE2EDuration="9.134188967s" podCreationTimestamp="2025-10-03 15:17:46 +0000 UTC" firstStartedPulling="2025-10-03 15:17:47.973036203 +0000 UTC m=+4617.831762450" lastFinishedPulling="2025-10-03 15:17:54.513553405 +0000 UTC m=+4624.372279662" observedRunningTime="2025-10-03 15:17:55.132689488 +0000 UTC m=+4624.991415755" watchObservedRunningTime="2025-10-03 15:17:55.134188967 +0000 UTC m=+4624.992915214" Oct 03 15:17:55 crc kubenswrapper[4636]: I1003 15:17:55.319268 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4njmj_88e3290e-0c0d-4304-bcd2-b500068dc443/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:17:55 crc kubenswrapper[4636]: I1003 15:17:55.364215 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/swift-recon-cron/0.log" Oct 03 15:17:55 crc kubenswrapper[4636]: I1003 15:17:55.652774 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_76d391b3-cee3-4591-814b-a1b99bed1872/tempest-tests-tempest-tests-runner/0.log" Oct 03 15:17:55 crc kubenswrapper[4636]: I1003 15:17:55.944924 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_d141c423-495c-4fa0-af39-06bd5c484253/test-operator-logs-container/0.log" Oct 03 15:17:56 crc kubenswrapper[4636]: I1003 15:17:56.233001 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-2jt64_baf6dabc-cac4-4e7c-9101-dcd5cfe39647/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:17:56 crc kubenswrapper[4636]: I1003 15:17:56.501544 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nhttt" Oct 03 15:17:56 crc kubenswrapper[4636]: I1003 15:17:56.501786 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nhttt" Oct 03 15:17:57 crc kubenswrapper[4636]: I1003 15:17:57.654519 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nhttt" podUID="e2ba6ce8-af8f-4027-9d19-06af3a06c4c8" containerName="registry-server" probeResult="failure" output=< Oct 03 15:17:57 crc kubenswrapper[4636]: timeout: failed to connect service ":50051" within 1s Oct 03 15:17:57 crc kubenswrapper[4636]: > Oct 03 15:18:01 crc kubenswrapper[4636]: I1003 15:18:01.366010 4636 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_17e09844-cd33-42a1-a0dc-e1995b872663/memcached/0.log" Oct 03 15:18:01 crc kubenswrapper[4636]: I1003 15:18:01.524557 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-xrzrx" podUID="37e2d0a1-58e7-487a-841a-53f45820362d" containerName="registry-server" probeResult="failure" output=< Oct 03 15:18:01 crc kubenswrapper[4636]: timeout: failed to connect service ":50051" within 1s Oct 03 15:18:01 crc kubenswrapper[4636]: > Oct 03 15:18:08 crc kubenswrapper[4636]: I1003 15:18:08.027883 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nhttt" podUID="e2ba6ce8-af8f-4027-9d19-06af3a06c4c8" containerName="registry-server" probeResult="failure" output=< Oct 03 15:18:08 crc kubenswrapper[4636]: timeout: failed to connect service ":50051" within 1s Oct 03 15:18:08 crc kubenswrapper[4636]: > Oct 03 15:18:10 crc kubenswrapper[4636]: I1003 15:18:10.498030 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xrzrx" Oct 03 15:18:10 crc kubenswrapper[4636]: I1003 15:18:10.559835 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xrzrx" Oct 03 15:18:11 crc kubenswrapper[4636]: I1003 15:18:11.044728 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xrzrx"] Oct 03 15:18:12 crc kubenswrapper[4636]: I1003 15:18:12.264645 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xrzrx" podUID="37e2d0a1-58e7-487a-841a-53f45820362d" containerName="registry-server" containerID="cri-o://02fd170cdd376e11e486ea621244469cd39cd672afdf4bf6cd259683a2a4d7ac" gracePeriod=2 Oct 03 15:18:12 crc kubenswrapper[4636]: I1003 15:18:12.917286 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xrzrx" Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.089752 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9h4f\" (UniqueName: \"kubernetes.io/projected/37e2d0a1-58e7-487a-841a-53f45820362d-kube-api-access-w9h4f\") pod \"37e2d0a1-58e7-487a-841a-53f45820362d\" (UID: \"37e2d0a1-58e7-487a-841a-53f45820362d\") " Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.089889 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37e2d0a1-58e7-487a-841a-53f45820362d-catalog-content\") pod \"37e2d0a1-58e7-487a-841a-53f45820362d\" (UID: \"37e2d0a1-58e7-487a-841a-53f45820362d\") " Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.089996 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37e2d0a1-58e7-487a-841a-53f45820362d-utilities\") pod \"37e2d0a1-58e7-487a-841a-53f45820362d\" (UID: \"37e2d0a1-58e7-487a-841a-53f45820362d\") " Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.090656 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37e2d0a1-58e7-487a-841a-53f45820362d-utilities" (OuterVolumeSpecName: "utilities") pod "37e2d0a1-58e7-487a-841a-53f45820362d" (UID: "37e2d0a1-58e7-487a-841a-53f45820362d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.096246 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37e2d0a1-58e7-487a-841a-53f45820362d-kube-api-access-w9h4f" (OuterVolumeSpecName: "kube-api-access-w9h4f") pod "37e2d0a1-58e7-487a-841a-53f45820362d" (UID: "37e2d0a1-58e7-487a-841a-53f45820362d"). InnerVolumeSpecName "kube-api-access-w9h4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.148387 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37e2d0a1-58e7-487a-841a-53f45820362d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37e2d0a1-58e7-487a-841a-53f45820362d" (UID: "37e2d0a1-58e7-487a-841a-53f45820362d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.191802 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37e2d0a1-58e7-487a-841a-53f45820362d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.192074 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9h4f\" (UniqueName: \"kubernetes.io/projected/37e2d0a1-58e7-487a-841a-53f45820362d-kube-api-access-w9h4f\") on node \"crc\" DevicePath \"\"" Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.192086 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37e2d0a1-58e7-487a-841a-53f45820362d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.273301 4636 generic.go:334] "Generic (PLEG): container finished" podID="37e2d0a1-58e7-487a-841a-53f45820362d" containerID="02fd170cdd376e11e486ea621244469cd39cd672afdf4bf6cd259683a2a4d7ac" exitCode=0 Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.273491 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrzrx" event={"ID":"37e2d0a1-58e7-487a-841a-53f45820362d","Type":"ContainerDied","Data":"02fd170cdd376e11e486ea621244469cd39cd672afdf4bf6cd259683a2a4d7ac"} Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.274421 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xrzrx" event={"ID":"37e2d0a1-58e7-487a-841a-53f45820362d","Type":"ContainerDied","Data":"fe333ebbc3692453fd833ac8008aa6e834f1b9b3d21331ba303c908f4850d85b"} Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.274498 4636 scope.go:117] "RemoveContainer" containerID="02fd170cdd376e11e486ea621244469cd39cd672afdf4bf6cd259683a2a4d7ac" Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.273555 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xrzrx" Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.295404 4636 scope.go:117] "RemoveContainer" containerID="488d1775a8fbf9464a16cffb40fbbd5d4351c989892e2bc2360607a9f7b4009d" Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.318148 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xrzrx"] Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.327806 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xrzrx"] Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.815981 4636 scope.go:117] "RemoveContainer" containerID="ec894dc15539324bb37792a1561adead07c0254abba7ab38387135cf83bef38f" Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.869716 4636 scope.go:117] "RemoveContainer" containerID="02fd170cdd376e11e486ea621244469cd39cd672afdf4bf6cd259683a2a4d7ac" Oct 03 15:18:13 crc kubenswrapper[4636]: E1003 15:18:13.870485 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02fd170cdd376e11e486ea621244469cd39cd672afdf4bf6cd259683a2a4d7ac\": container with ID starting with 02fd170cdd376e11e486ea621244469cd39cd672afdf4bf6cd259683a2a4d7ac not found: ID does not exist" containerID="02fd170cdd376e11e486ea621244469cd39cd672afdf4bf6cd259683a2a4d7ac" Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.870515 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02fd170cdd376e11e486ea621244469cd39cd672afdf4bf6cd259683a2a4d7ac"} err="failed to get container status \"02fd170cdd376e11e486ea621244469cd39cd672afdf4bf6cd259683a2a4d7ac\": rpc error: code = NotFound desc = could not find container \"02fd170cdd376e11e486ea621244469cd39cd672afdf4bf6cd259683a2a4d7ac\": container with ID starting with 02fd170cdd376e11e486ea621244469cd39cd672afdf4bf6cd259683a2a4d7ac not found: ID does not exist" Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.870537 4636 scope.go:117] "RemoveContainer" containerID="488d1775a8fbf9464a16cffb40fbbd5d4351c989892e2bc2360607a9f7b4009d" Oct 03 15:18:13 crc kubenswrapper[4636]: E1003 15:18:13.870791 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"488d1775a8fbf9464a16cffb40fbbd5d4351c989892e2bc2360607a9f7b4009d\": container with ID starting with 488d1775a8fbf9464a16cffb40fbbd5d4351c989892e2bc2360607a9f7b4009d not found: ID does not exist" containerID="488d1775a8fbf9464a16cffb40fbbd5d4351c989892e2bc2360607a9f7b4009d" Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.870814 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"488d1775a8fbf9464a16cffb40fbbd5d4351c989892e2bc2360607a9f7b4009d"} err="failed to get container status \"488d1775a8fbf9464a16cffb40fbbd5d4351c989892e2bc2360607a9f7b4009d\": rpc error: code = NotFound desc = could not find container \"488d1775a8fbf9464a16cffb40fbbd5d4351c989892e2bc2360607a9f7b4009d\": container with ID starting with 488d1775a8fbf9464a16cffb40fbbd5d4351c989892e2bc2360607a9f7b4009d not found: ID does not exist" Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.870826 4636 scope.go:117] "RemoveContainer" containerID="ec894dc15539324bb37792a1561adead07c0254abba7ab38387135cf83bef38f" Oct 03 15:18:13 crc kubenswrapper[4636]: E1003 15:18:13.871078 4636 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ec894dc15539324bb37792a1561adead07c0254abba7ab38387135cf83bef38f\": container with ID starting with ec894dc15539324bb37792a1561adead07c0254abba7ab38387135cf83bef38f not found: ID does not exist" containerID="ec894dc15539324bb37792a1561adead07c0254abba7ab38387135cf83bef38f" Oct 03 15:18:13 crc kubenswrapper[4636]: I1003 15:18:13.871234 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec894dc15539324bb37792a1561adead07c0254abba7ab38387135cf83bef38f"} err="failed to get container status \"ec894dc15539324bb37792a1561adead07c0254abba7ab38387135cf83bef38f\": rpc error: code = NotFound desc = could not find container \"ec894dc15539324bb37792a1561adead07c0254abba7ab38387135cf83bef38f\": container with ID starting with ec894dc15539324bb37792a1561adead07c0254abba7ab38387135cf83bef38f not found: ID does not exist" Oct 03 15:18:14 crc kubenswrapper[4636]: I1003 15:18:14.831111 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37e2d0a1-58e7-487a-841a-53f45820362d" path="/var/lib/kubelet/pods/37e2d0a1-58e7-487a-841a-53f45820362d/volumes" Oct 03 15:18:17 crc kubenswrapper[4636]: I1003 15:18:17.563824 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nhttt" podUID="e2ba6ce8-af8f-4027-9d19-06af3a06c4c8" containerName="registry-server" probeResult="failure" output=< Oct 03 15:18:17 crc kubenswrapper[4636]: timeout: failed to connect service ":50051" within 1s Oct 03 15:18:17 crc kubenswrapper[4636]: > Oct 03 15:18:27 crc kubenswrapper[4636]: I1003 15:18:27.549345 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nhttt" podUID="e2ba6ce8-af8f-4027-9d19-06af3a06c4c8" containerName="registry-server" probeResult="failure" output=< Oct 03 15:18:27 crc kubenswrapper[4636]: timeout: failed to connect service ":50051" within 1s Oct 03 15:18:27 crc kubenswrapper[4636]: > Oct 03 15:18:36 crc kubenswrapper[4636]: I1003 15:18:36.546426 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nhttt" Oct 03 15:18:36 crc kubenswrapper[4636]: I1003 15:18:36.596298 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nhttt" Oct 03 15:18:36 crc kubenswrapper[4636]: I1003 15:18:36.788368 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nhttt"] Oct 03 15:18:38 crc kubenswrapper[4636]: I1003 15:18:38.506263 4636 generic.go:334] "Generic (PLEG): container finished" podID="450fc841-5bdf-47f3-8c20-6b70396e445f" containerID="3e567e1edf4c7ea08dbdc8f68993bf0900be4dd3b9ce68e257ebf3f3bc99c27f" exitCode=0 Oct 03 15:18:38 crc kubenswrapper[4636]: I1003 15:18:38.506771 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nhttt" podUID="e2ba6ce8-af8f-4027-9d19-06af3a06c4c8" containerName="registry-server" containerID="cri-o://babcd8eee3c4e6771c801819be5980b59ce88e8076c1992810e6a4063a190195" gracePeriod=2 Oct 03 15:18:38 crc kubenswrapper[4636]: I1003 15:18:38.507150 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r2hg7/crc-debug-72bnq" event={"ID":"450fc841-5bdf-47f3-8c20-6b70396e445f","Type":"ContainerDied","Data":"3e567e1edf4c7ea08dbdc8f68993bf0900be4dd3b9ce68e257ebf3f3bc99c27f"} Oct 03 15:18:38 crc kubenswrapper[4636]: 
I1003 15:18:38.958301 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nhttt" Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.100855 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2ba6ce8-af8f-4027-9d19-06af3a06c4c8-utilities\") pod \"e2ba6ce8-af8f-4027-9d19-06af3a06c4c8\" (UID: \"e2ba6ce8-af8f-4027-9d19-06af3a06c4c8\") " Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.100930 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjqhv\" (UniqueName: \"kubernetes.io/projected/e2ba6ce8-af8f-4027-9d19-06af3a06c4c8-kube-api-access-hjqhv\") pod \"e2ba6ce8-af8f-4027-9d19-06af3a06c4c8\" (UID: \"e2ba6ce8-af8f-4027-9d19-06af3a06c4c8\") " Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.101015 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2ba6ce8-af8f-4027-9d19-06af3a06c4c8-catalog-content\") pod \"e2ba6ce8-af8f-4027-9d19-06af3a06c4c8\" (UID: \"e2ba6ce8-af8f-4027-9d19-06af3a06c4c8\") " Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.101802 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2ba6ce8-af8f-4027-9d19-06af3a06c4c8-utilities" (OuterVolumeSpecName: "utilities") pod "e2ba6ce8-af8f-4027-9d19-06af3a06c4c8" (UID: "e2ba6ce8-af8f-4027-9d19-06af3a06c4c8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.106596 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2ba6ce8-af8f-4027-9d19-06af3a06c4c8-kube-api-access-hjqhv" (OuterVolumeSpecName: "kube-api-access-hjqhv") pod "e2ba6ce8-af8f-4027-9d19-06af3a06c4c8" (UID: "e2ba6ce8-af8f-4027-9d19-06af3a06c4c8"). InnerVolumeSpecName "kube-api-access-hjqhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.194307 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2ba6ce8-af8f-4027-9d19-06af3a06c4c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2ba6ce8-af8f-4027-9d19-06af3a06c4c8" (UID: "e2ba6ce8-af8f-4027-9d19-06af3a06c4c8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.203916 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2ba6ce8-af8f-4027-9d19-06af3a06c4c8-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.203948 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjqhv\" (UniqueName: \"kubernetes.io/projected/e2ba6ce8-af8f-4027-9d19-06af3a06c4c8-kube-api-access-hjqhv\") on node \"crc\" DevicePath \"\"" Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.203959 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2ba6ce8-af8f-4027-9d19-06af3a06c4c8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.516422 4636 generic.go:334] "Generic (PLEG): container finished" podID="e2ba6ce8-af8f-4027-9d19-06af3a06c4c8" containerID="babcd8eee3c4e6771c801819be5980b59ce88e8076c1992810e6a4063a190195" exitCode=0 Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.516501 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhttt" event={"ID":"e2ba6ce8-af8f-4027-9d19-06af3a06c4c8","Type":"ContainerDied","Data":"babcd8eee3c4e6771c801819be5980b59ce88e8076c1992810e6a4063a190195"} Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.516561 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhttt" event={"ID":"e2ba6ce8-af8f-4027-9d19-06af3a06c4c8","Type":"ContainerDied","Data":"db77ac002709ad299527fde1a7b0bd5b88a3a7d3ac329b3f78ee4df472051690"} Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.516583 4636 scope.go:117] "RemoveContainer" containerID="babcd8eee3c4e6771c801819be5980b59ce88e8076c1992810e6a4063a190195" Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.517237 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nhttt" Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.573733 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r2hg7/crc-debug-72bnq"
Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.576791 4636 scope.go:117] "RemoveContainer" containerID="8f1515b591823f5e8c48651dc38076a20cf3c69713c47a08cf9e8c891aaa70f7"
Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.594082 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nhttt"]
Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.606944 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nhttt"]
Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.613011 4636 scope.go:117] "RemoveContainer" containerID="883bd44b74917dc8793969c61f735a6fb4e307d4d28579ef76c66cedc99978be"
Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.613338 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/450fc841-5bdf-47f3-8c20-6b70396e445f-host\") pod \"450fc841-5bdf-47f3-8c20-6b70396e445f\" (UID: \"450fc841-5bdf-47f3-8c20-6b70396e445f\") "
Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.613454 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88d6x\" (UniqueName: \"kubernetes.io/projected/450fc841-5bdf-47f3-8c20-6b70396e445f-kube-api-access-88d6x\") pod \"450fc841-5bdf-47f3-8c20-6b70396e445f\" (UID: \"450fc841-5bdf-47f3-8c20-6b70396e445f\") "
Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.614241 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/450fc841-5bdf-47f3-8c20-6b70396e445f-host" (OuterVolumeSpecName: "host") pod "450fc841-5bdf-47f3-8c20-6b70396e445f" (UID: "450fc841-5bdf-47f3-8c20-6b70396e445f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.617832 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/450fc841-5bdf-47f3-8c20-6b70396e445f-kube-api-access-88d6x" (OuterVolumeSpecName: "kube-api-access-88d6x") pod "450fc841-5bdf-47f3-8c20-6b70396e445f" (UID: "450fc841-5bdf-47f3-8c20-6b70396e445f"). InnerVolumeSpecName "kube-api-access-88d6x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.635685 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r2hg7/crc-debug-72bnq"]
Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.642988 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r2hg7/crc-debug-72bnq"]
Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.714771 4636 scope.go:117] "RemoveContainer" containerID="babcd8eee3c4e6771c801819be5980b59ce88e8076c1992810e6a4063a190195"
Oct 03 15:18:39 crc kubenswrapper[4636]: E1003 15:18:39.715346 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"babcd8eee3c4e6771c801819be5980b59ce88e8076c1992810e6a4063a190195\": container with ID starting with babcd8eee3c4e6771c801819be5980b59ce88e8076c1992810e6a4063a190195 not found: ID does not exist" containerID="babcd8eee3c4e6771c801819be5980b59ce88e8076c1992810e6a4063a190195"
Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.715384 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"babcd8eee3c4e6771c801819be5980b59ce88e8076c1992810e6a4063a190195"} err="failed to get container status \"babcd8eee3c4e6771c801819be5980b59ce88e8076c1992810e6a4063a190195\": rpc error: code = NotFound desc = could not find container \"babcd8eee3c4e6771c801819be5980b59ce88e8076c1992810e6a4063a190195\": container with ID starting with babcd8eee3c4e6771c801819be5980b59ce88e8076c1992810e6a4063a190195 not found: ID does not exist"
Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.715429 4636 scope.go:117] "RemoveContainer" containerID="8f1515b591823f5e8c48651dc38076a20cf3c69713c47a08cf9e8c891aaa70f7"
Oct 03 15:18:39 crc kubenswrapper[4636]: E1003 15:18:39.715723 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f1515b591823f5e8c48651dc38076a20cf3c69713c47a08cf9e8c891aaa70f7\": container with ID starting with 8f1515b591823f5e8c48651dc38076a20cf3c69713c47a08cf9e8c891aaa70f7 not found: ID does not exist" containerID="8f1515b591823f5e8c48651dc38076a20cf3c69713c47a08cf9e8c891aaa70f7"
Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.715771 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f1515b591823f5e8c48651dc38076a20cf3c69713c47a08cf9e8c891aaa70f7"} err="failed to get container status \"8f1515b591823f5e8c48651dc38076a20cf3c69713c47a08cf9e8c891aaa70f7\": rpc error: code = NotFound desc = could not find container \"8f1515b591823f5e8c48651dc38076a20cf3c69713c47a08cf9e8c891aaa70f7\": container with ID starting with 8f1515b591823f5e8c48651dc38076a20cf3c69713c47a08cf9e8c891aaa70f7 not found: ID does not exist"
Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.715788 4636 scope.go:117] "RemoveContainer" containerID="883bd44b74917dc8793969c61f735a6fb4e307d4d28579ef76c66cedc99978be"
Oct 03 15:18:39 crc kubenswrapper[4636]: E1003 15:18:39.716034 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"883bd44b74917dc8793969c61f735a6fb4e307d4d28579ef76c66cedc99978be\": container with ID starting with 883bd44b74917dc8793969c61f735a6fb4e307d4d28579ef76c66cedc99978be not found: ID does not exist" containerID="883bd44b74917dc8793969c61f735a6fb4e307d4d28579ef76c66cedc99978be"
Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.716060 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"883bd44b74917dc8793969c61f735a6fb4e307d4d28579ef76c66cedc99978be"} err="failed to get container status \"883bd44b74917dc8793969c61f735a6fb4e307d4d28579ef76c66cedc99978be\": rpc error: code = NotFound desc = could not find container \"883bd44b74917dc8793969c61f735a6fb4e307d4d28579ef76c66cedc99978be\": container with ID starting with 883bd44b74917dc8793969c61f735a6fb4e307d4d28579ef76c66cedc99978be not found: ID does not exist"
Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.717265 4636 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/450fc841-5bdf-47f3-8c20-6b70396e445f-host\") on node \"crc\" DevicePath \"\""
Oct 03 15:18:39 crc kubenswrapper[4636]: I1003 15:18:39.717292 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88d6x\" (UniqueName: \"kubernetes.io/projected/450fc841-5bdf-47f3-8c20-6b70396e445f-kube-api-access-88d6x\") on node \"crc\" DevicePath \"\""
Oct 03 15:18:40 crc kubenswrapper[4636]: I1003 15:18:40.531772 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ec3f0c460d61ef8424a489909e1e2b53267b7d972b94827ed02ef1841ad081c"
Oct 03 15:18:40 crc kubenswrapper[4636]: I1003 15:18:40.531857 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r2hg7/crc-debug-72bnq"
Oct 03 15:18:40 crc kubenswrapper[4636]: I1003 15:18:40.775593 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r2hg7/crc-debug-4vxgn"]
Oct 03 15:18:40 crc kubenswrapper[4636]: E1003 15:18:40.776287 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ba6ce8-af8f-4027-9d19-06af3a06c4c8" containerName="extract-utilities"
Oct 03 15:18:40 crc kubenswrapper[4636]: I1003 15:18:40.776305 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ba6ce8-af8f-4027-9d19-06af3a06c4c8" containerName="extract-utilities"
Oct 03 15:18:40 crc kubenswrapper[4636]: E1003 15:18:40.776323 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ba6ce8-af8f-4027-9d19-06af3a06c4c8" containerName="registry-server"
Oct 03 15:18:40 crc kubenswrapper[4636]: I1003 15:18:40.776330 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ba6ce8-af8f-4027-9d19-06af3a06c4c8" containerName="registry-server"
Oct 03 15:18:40 crc kubenswrapper[4636]: E1003 15:18:40.776344 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ba6ce8-af8f-4027-9d19-06af3a06c4c8" containerName="extract-content"
Oct 03 15:18:40 crc kubenswrapper[4636]: I1003 15:18:40.776352 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ba6ce8-af8f-4027-9d19-06af3a06c4c8" containerName="extract-content"
Oct 03 15:18:40 crc kubenswrapper[4636]: E1003 15:18:40.776360 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e2d0a1-58e7-487a-841a-53f45820362d" containerName="registry-server"
Oct 03 15:18:40 crc kubenswrapper[4636]: I1003 15:18:40.776365 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e2d0a1-58e7-487a-841a-53f45820362d" containerName="registry-server"
Oct 03 15:18:40 crc kubenswrapper[4636]: E1003 15:18:40.776382 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="450fc841-5bdf-47f3-8c20-6b70396e445f" containerName="container-00"
Oct 03 15:18:40 crc kubenswrapper[4636]: I1003 15:18:40.776390 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="450fc841-5bdf-47f3-8c20-6b70396e445f" containerName="container-00"
CPUSet assignment" podUID="450fc841-5bdf-47f3-8c20-6b70396e445f" containerName="container-00" Oct 03 15:18:40 crc kubenswrapper[4636]: E1003 15:18:40.776405 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e2d0a1-58e7-487a-841a-53f45820362d" containerName="extract-content" Oct 03 15:18:40 crc kubenswrapper[4636]: I1003 15:18:40.776410 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e2d0a1-58e7-487a-841a-53f45820362d" containerName="extract-content" Oct 03 15:18:40 crc kubenswrapper[4636]: E1003 15:18:40.776431 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e2d0a1-58e7-487a-841a-53f45820362d" containerName="extract-utilities" Oct 03 15:18:40 crc kubenswrapper[4636]: I1003 15:18:40.776437 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e2d0a1-58e7-487a-841a-53f45820362d" containerName="extract-utilities" Oct 03 15:18:40 crc kubenswrapper[4636]: I1003 15:18:40.776629 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="450fc841-5bdf-47f3-8c20-6b70396e445f" containerName="container-00" Oct 03 15:18:40 crc kubenswrapper[4636]: I1003 15:18:40.776644 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2ba6ce8-af8f-4027-9d19-06af3a06c4c8" containerName="registry-server" Oct 03 15:18:40 crc kubenswrapper[4636]: I1003 15:18:40.776654 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="37e2d0a1-58e7-487a-841a-53f45820362d" containerName="registry-server" Oct 03 15:18:40 crc kubenswrapper[4636]: I1003 15:18:40.777254 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r2hg7/crc-debug-4vxgn" Oct 03 15:18:40 crc kubenswrapper[4636]: I1003 15:18:40.813243 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="450fc841-5bdf-47f3-8c20-6b70396e445f" path="/var/lib/kubelet/pods/450fc841-5bdf-47f3-8c20-6b70396e445f/volumes" Oct 03 15:18:40 crc kubenswrapper[4636]: I1003 15:18:40.814343 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2ba6ce8-af8f-4027-9d19-06af3a06c4c8" path="/var/lib/kubelet/pods/e2ba6ce8-af8f-4027-9d19-06af3a06c4c8/volumes" Oct 03 15:18:40 crc kubenswrapper[4636]: I1003 15:18:40.841828 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtcjf\" (UniqueName: \"kubernetes.io/projected/e779d42f-c366-4896-95bb-139bad755e13-kube-api-access-jtcjf\") pod \"crc-debug-4vxgn\" (UID: \"e779d42f-c366-4896-95bb-139bad755e13\") " pod="openshift-must-gather-r2hg7/crc-debug-4vxgn" Oct 03 15:18:40 crc kubenswrapper[4636]: I1003 15:18:40.841930 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e779d42f-c366-4896-95bb-139bad755e13-host\") pod \"crc-debug-4vxgn\" (UID: \"e779d42f-c366-4896-95bb-139bad755e13\") " pod="openshift-must-gather-r2hg7/crc-debug-4vxgn" Oct 03 15:18:40 crc kubenswrapper[4636]: I1003 15:18:40.944415 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtcjf\" (UniqueName: \"kubernetes.io/projected/e779d42f-c366-4896-95bb-139bad755e13-kube-api-access-jtcjf\") pod \"crc-debug-4vxgn\" (UID: \"e779d42f-c366-4896-95bb-139bad755e13\") " pod="openshift-must-gather-r2hg7/crc-debug-4vxgn" Oct 03 15:18:40 crc kubenswrapper[4636]: I1003 15:18:40.944475 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/e779d42f-c366-4896-95bb-139bad755e13-host\") pod \"crc-debug-4vxgn\" (UID: \"e779d42f-c366-4896-95bb-139bad755e13\") " pod="openshift-must-gather-r2hg7/crc-debug-4vxgn" Oct 03 15:18:40 crc kubenswrapper[4636]: I1003 15:18:40.944633 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e779d42f-c366-4896-95bb-139bad755e13-host\") pod \"crc-debug-4vxgn\" (UID: \"e779d42f-c366-4896-95bb-139bad755e13\") " pod="openshift-must-gather-r2hg7/crc-debug-4vxgn" Oct 03 15:18:41 crc kubenswrapper[4636]: I1003 15:18:41.290240 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtcjf\" (UniqueName: \"kubernetes.io/projected/e779d42f-c366-4896-95bb-139bad755e13-kube-api-access-jtcjf\") pod \"crc-debug-4vxgn\" (UID: \"e779d42f-c366-4896-95bb-139bad755e13\") " pod="openshift-must-gather-r2hg7/crc-debug-4vxgn" Oct 03 15:18:41 crc kubenswrapper[4636]: I1003 15:18:41.395536 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r2hg7/crc-debug-4vxgn" Oct 03 15:18:41 crc kubenswrapper[4636]: I1003 15:18:41.546119 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r2hg7/crc-debug-4vxgn" event={"ID":"e779d42f-c366-4896-95bb-139bad755e13","Type":"ContainerStarted","Data":"fe617c090d01c0f245a5b21b88da5501f53646710997723c07b44a7e6d893cfb"} Oct 03 15:18:42 crc kubenswrapper[4636]: I1003 15:18:42.556846 4636 generic.go:334] "Generic (PLEG): container finished" podID="e779d42f-c366-4896-95bb-139bad755e13" containerID="377ad8605e112984615ac4c1c7e93bd3cd68dcdb20bf7cc5f4982dea7eb3ec29" exitCode=0 Oct 03 15:18:42 crc kubenswrapper[4636]: I1003 15:18:42.556942 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r2hg7/crc-debug-4vxgn" event={"ID":"e779d42f-c366-4896-95bb-139bad755e13","Type":"ContainerDied","Data":"377ad8605e112984615ac4c1c7e93bd3cd68dcdb20bf7cc5f4982dea7eb3ec29"} Oct 03 15:18:43 crc kubenswrapper[4636]: I1003 15:18:43.677482 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r2hg7/crc-debug-4vxgn" Oct 03 15:18:43 crc kubenswrapper[4636]: I1003 15:18:43.799511 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtcjf\" (UniqueName: \"kubernetes.io/projected/e779d42f-c366-4896-95bb-139bad755e13-kube-api-access-jtcjf\") pod \"e779d42f-c366-4896-95bb-139bad755e13\" (UID: \"e779d42f-c366-4896-95bb-139bad755e13\") " Oct 03 15:18:43 crc kubenswrapper[4636]: I1003 15:18:43.799845 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e779d42f-c366-4896-95bb-139bad755e13-host\") pod \"e779d42f-c366-4896-95bb-139bad755e13\" (UID: \"e779d42f-c366-4896-95bb-139bad755e13\") " Oct 03 15:18:43 crc kubenswrapper[4636]: I1003 15:18:43.800340 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e779d42f-c366-4896-95bb-139bad755e13-host" (OuterVolumeSpecName: "host") pod "e779d42f-c366-4896-95bb-139bad755e13" (UID: "e779d42f-c366-4896-95bb-139bad755e13"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 15:18:43 crc kubenswrapper[4636]: I1003 15:18:43.815491 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e779d42f-c366-4896-95bb-139bad755e13-kube-api-access-jtcjf" (OuterVolumeSpecName: "kube-api-access-jtcjf") pod "e779d42f-c366-4896-95bb-139bad755e13" (UID: "e779d42f-c366-4896-95bb-139bad755e13"). InnerVolumeSpecName "kube-api-access-jtcjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:18:43 crc kubenswrapper[4636]: I1003 15:18:43.902194 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtcjf\" (UniqueName: \"kubernetes.io/projected/e779d42f-c366-4896-95bb-139bad755e13-kube-api-access-jtcjf\") on node \"crc\" DevicePath \"\"" Oct 03 15:18:43 crc kubenswrapper[4636]: I1003 15:18:43.902225 4636 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e779d42f-c366-4896-95bb-139bad755e13-host\") on node \"crc\" DevicePath \"\"" Oct 03 15:18:44 crc kubenswrapper[4636]: I1003 15:18:44.573997 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r2hg7/crc-debug-4vxgn" event={"ID":"e779d42f-c366-4896-95bb-139bad755e13","Type":"ContainerDied","Data":"fe617c090d01c0f245a5b21b88da5501f53646710997723c07b44a7e6d893cfb"} Oct 03 15:18:44 crc kubenswrapper[4636]: I1003 15:18:44.574041 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe617c090d01c0f245a5b21b88da5501f53646710997723c07b44a7e6d893cfb" Oct 03 15:18:44 crc kubenswrapper[4636]: I1003 15:18:44.574131 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r2hg7/crc-debug-4vxgn" Oct 03 15:18:51 crc kubenswrapper[4636]: I1003 15:18:51.306554 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r2hg7/crc-debug-4vxgn"] Oct 03 15:18:51 crc kubenswrapper[4636]: I1003 15:18:51.315253 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r2hg7/crc-debug-4vxgn"] Oct 03 15:18:52 crc kubenswrapper[4636]: I1003 15:18:52.749122 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r2hg7/crc-debug-x6jb4"] Oct 03 15:18:52 crc kubenswrapper[4636]: E1003 15:18:52.749754 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e779d42f-c366-4896-95bb-139bad755e13" containerName="container-00" Oct 03 15:18:52 crc kubenswrapper[4636]: I1003 15:18:52.749768 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="e779d42f-c366-4896-95bb-139bad755e13" containerName="container-00" Oct 03 15:18:52 crc kubenswrapper[4636]: I1003 15:18:52.749940 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="e779d42f-c366-4896-95bb-139bad755e13" containerName="container-00" Oct 03 15:18:52 crc kubenswrapper[4636]: I1003 15:18:52.751049 4636 util.go:30] "No sandbox for pod can be found. 
Oct 03 15:18:52 crc kubenswrapper[4636]: I1003 15:18:52.804222 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e779d42f-c366-4896-95bb-139bad755e13" path="/var/lib/kubelet/pods/e779d42f-c366-4896-95bb-139bad755e13/volumes"
Oct 03 15:18:52 crc kubenswrapper[4636]: I1003 15:18:52.855385 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjktt\" (UniqueName: \"kubernetes.io/projected/1b9da472-6a5c-40c7-890d-960b548115ca-kube-api-access-pjktt\") pod \"crc-debug-x6jb4\" (UID: \"1b9da472-6a5c-40c7-890d-960b548115ca\") " pod="openshift-must-gather-r2hg7/crc-debug-x6jb4"
Oct 03 15:18:52 crc kubenswrapper[4636]: I1003 15:18:52.855527 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b9da472-6a5c-40c7-890d-960b548115ca-host\") pod \"crc-debug-x6jb4\" (UID: \"1b9da472-6a5c-40c7-890d-960b548115ca\") " pod="openshift-must-gather-r2hg7/crc-debug-x6jb4"
Oct 03 15:18:52 crc kubenswrapper[4636]: I1003 15:18:52.957598 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b9da472-6a5c-40c7-890d-960b548115ca-host\") pod \"crc-debug-x6jb4\" (UID: \"1b9da472-6a5c-40c7-890d-960b548115ca\") " pod="openshift-must-gather-r2hg7/crc-debug-x6jb4"
Oct 03 15:18:52 crc kubenswrapper[4636]: I1003 15:18:52.957721 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjktt\" (UniqueName: \"kubernetes.io/projected/1b9da472-6a5c-40c7-890d-960b548115ca-kube-api-access-pjktt\") pod \"crc-debug-x6jb4\" (UID: \"1b9da472-6a5c-40c7-890d-960b548115ca\") " pod="openshift-must-gather-r2hg7/crc-debug-x6jb4"
Oct 03 15:18:52 crc kubenswrapper[4636]: I1003 15:18:52.958152 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b9da472-6a5c-40c7-890d-960b548115ca-host\") pod \"crc-debug-x6jb4\" (UID: \"1b9da472-6a5c-40c7-890d-960b548115ca\") " pod="openshift-must-gather-r2hg7/crc-debug-x6jb4"
Oct 03 15:18:52 crc kubenswrapper[4636]: I1003 15:18:52.981530 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjktt\" (UniqueName: \"kubernetes.io/projected/1b9da472-6a5c-40c7-890d-960b548115ca-kube-api-access-pjktt\") pod \"crc-debug-x6jb4\" (UID: \"1b9da472-6a5c-40c7-890d-960b548115ca\") " pod="openshift-must-gather-r2hg7/crc-debug-x6jb4"
Oct 03 15:18:53 crc kubenswrapper[4636]: I1003 15:18:53.066090 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r2hg7/crc-debug-x6jb4"
Oct 03 15:18:53 crc kubenswrapper[4636]: I1003 15:18:53.660439 4636 generic.go:334] "Generic (PLEG): container finished" podID="1b9da472-6a5c-40c7-890d-960b548115ca" containerID="1b6488d2216fd366bff04a4e6ce49e05cbb6380776eaa4a01e7b2c83c636a1f6" exitCode=0
Oct 03 15:18:53 crc kubenswrapper[4636]: I1003 15:18:53.660533 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r2hg7/crc-debug-x6jb4" event={"ID":"1b9da472-6a5c-40c7-890d-960b548115ca","Type":"ContainerDied","Data":"1b6488d2216fd366bff04a4e6ce49e05cbb6380776eaa4a01e7b2c83c636a1f6"}
Oct 03 15:18:53 crc kubenswrapper[4636]: I1003 15:18:53.660750 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r2hg7/crc-debug-x6jb4" event={"ID":"1b9da472-6a5c-40c7-890d-960b548115ca","Type":"ContainerStarted","Data":"41d1fd977bedfe0b45c4f757bd38cd0ec8d049379d180b2d7c4b0fc5f93fa8a4"}
Oct 03 15:18:53 crc kubenswrapper[4636]: I1003 15:18:53.699657 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r2hg7/crc-debug-x6jb4"]
Oct 03 15:18:53 crc kubenswrapper[4636]: I1003 15:18:53.709572 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r2hg7/crc-debug-x6jb4"]
Oct 03 15:18:54 crc kubenswrapper[4636]: I1003 15:18:54.770059 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r2hg7/crc-debug-x6jb4"
Oct 03 15:18:54 crc kubenswrapper[4636]: I1003 15:18:54.890544 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b9da472-6a5c-40c7-890d-960b548115ca-host\") pod \"1b9da472-6a5c-40c7-890d-960b548115ca\" (UID: \"1b9da472-6a5c-40c7-890d-960b548115ca\") "
Oct 03 15:18:54 crc kubenswrapper[4636]: I1003 15:18:54.890664 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjktt\" (UniqueName: \"kubernetes.io/projected/1b9da472-6a5c-40c7-890d-960b548115ca-kube-api-access-pjktt\") pod \"1b9da472-6a5c-40c7-890d-960b548115ca\" (UID: \"1b9da472-6a5c-40c7-890d-960b548115ca\") "
Oct 03 15:18:54 crc kubenswrapper[4636]: I1003 15:18:54.890655 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b9da472-6a5c-40c7-890d-960b548115ca-host" (OuterVolumeSpecName: "host") pod "1b9da472-6a5c-40c7-890d-960b548115ca" (UID: "1b9da472-6a5c-40c7-890d-960b548115ca"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 03 15:18:54 crc kubenswrapper[4636]: I1003 15:18:54.891152 4636 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b9da472-6a5c-40c7-890d-960b548115ca-host\") on node \"crc\" DevicePath \"\""
Oct 03 15:18:54 crc kubenswrapper[4636]: I1003 15:18:54.903493 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b9da472-6a5c-40c7-890d-960b548115ca-kube-api-access-pjktt" (OuterVolumeSpecName: "kube-api-access-pjktt") pod "1b9da472-6a5c-40c7-890d-960b548115ca" (UID: "1b9da472-6a5c-40c7-890d-960b548115ca"). InnerVolumeSpecName "kube-api-access-pjktt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:18:54 crc kubenswrapper[4636]: I1003 15:18:54.993615 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjktt\" (UniqueName: \"kubernetes.io/projected/1b9da472-6a5c-40c7-890d-960b548115ca-kube-api-access-pjktt\") on node \"crc\" DevicePath \"\""
Oct 03 15:18:55 crc kubenswrapper[4636]: I1003 15:18:55.227951 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb_b1a4227a-2f56-4bdd-b347-9d8df4ed42e8/util/0.log"
Oct 03 15:18:55 crc kubenswrapper[4636]: I1003 15:18:55.440625 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb_b1a4227a-2f56-4bdd-b347-9d8df4ed42e8/pull/0.log"
Oct 03 15:18:55 crc kubenswrapper[4636]: I1003 15:18:55.465677 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb_b1a4227a-2f56-4bdd-b347-9d8df4ed42e8/util/0.log"
Oct 03 15:18:55 crc kubenswrapper[4636]: I1003 15:18:55.472158 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb_b1a4227a-2f56-4bdd-b347-9d8df4ed42e8/pull/0.log"
Oct 03 15:18:55 crc kubenswrapper[4636]: I1003 15:18:55.631162 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb_b1a4227a-2f56-4bdd-b347-9d8df4ed42e8/util/0.log"
Oct 03 15:18:55 crc kubenswrapper[4636]: I1003 15:18:55.649633 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb_b1a4227a-2f56-4bdd-b347-9d8df4ed42e8/extract/0.log"
Oct 03 15:18:55 crc kubenswrapper[4636]: I1003 15:18:55.668215 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb_b1a4227a-2f56-4bdd-b347-9d8df4ed42e8/pull/0.log"
Oct 03 15:18:55 crc kubenswrapper[4636]: I1003 15:18:55.682207 4636 scope.go:117] "RemoveContainer" containerID="1b6488d2216fd366bff04a4e6ce49e05cbb6380776eaa4a01e7b2c83c636a1f6"
Oct 03 15:18:55 crc kubenswrapper[4636]: I1003 15:18:55.682281 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r2hg7/crc-debug-x6jb4"
Oct 03 15:18:55 crc kubenswrapper[4636]: I1003 15:18:55.851787 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5f7c849b98-fh8lr_12b01d5f-b89d-4bf4-bd46-387f2a7ab48f/kube-rbac-proxy/0.log"
Oct 03 15:18:55 crc kubenswrapper[4636]: I1003 15:18:55.878480 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-z87w6_c8d803e5-9eca-49bf-976a-2acdfc25a727/kube-rbac-proxy/0.log"
Oct 03 15:18:55 crc kubenswrapper[4636]: I1003 15:18:55.893428 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5f7c849b98-fh8lr_12b01d5f-b89d-4bf4-bd46-387f2a7ab48f/manager/0.log"
Oct 03 15:18:56 crc kubenswrapper[4636]: I1003 15:18:56.067572 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-z87w6_c8d803e5-9eca-49bf-976a-2acdfc25a727/manager/0.log"
Oct 03 15:18:56 crc kubenswrapper[4636]: I1003 15:18:56.142916 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-x9xms_8002528c-8119-4119-923c-1e15162e63f3/kube-rbac-proxy/0.log"
Oct 03 15:18:56 crc kubenswrapper[4636]: I1003 15:18:56.147674 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-x9xms_8002528c-8119-4119-923c-1e15162e63f3/manager/0.log"
Oct 03 15:18:56 crc kubenswrapper[4636]: I1003 15:18:56.401442 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5568b5d68-g8m75_3c207da6-bfc7-4287-aa67-56c0097f48f3/kube-rbac-proxy/0.log"
Oct 03 15:18:56 crc kubenswrapper[4636]: I1003 15:18:56.460186 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5568b5d68-g8m75_3c207da6-bfc7-4287-aa67-56c0097f48f3/manager/0.log"
Oct 03 15:18:56 crc kubenswrapper[4636]: I1003 15:18:56.500906 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8f58bc9db-nshvn_eaba1b01-dfa6-48e4-b4f3-70a67fbfa8b5/kube-rbac-proxy/0.log"
Oct 03 15:18:56 crc kubenswrapper[4636]: I1003 15:18:56.643345 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8f58bc9db-nshvn_eaba1b01-dfa6-48e4-b4f3-70a67fbfa8b5/manager/0.log"
Oct 03 15:18:56 crc kubenswrapper[4636]: I1003 15:18:56.707968 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54876c876f-dwvpt_24c6a469-5b37-4dc9-baed-6a3c54b11861/manager/0.log"
Oct 03 15:18:56 crc kubenswrapper[4636]: I1003 15:18:56.734848 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54876c876f-dwvpt_24c6a469-5b37-4dc9-baed-6a3c54b11861/kube-rbac-proxy/0.log"
Oct 03 15:18:56 crc kubenswrapper[4636]: I1003 15:18:56.816609 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b9da472-6a5c-40c7-890d-960b548115ca" path="/var/lib/kubelet/pods/1b9da472-6a5c-40c7-890d-960b548115ca/volumes"
Oct 03 15:18:56 crc kubenswrapper[4636]: I1003 15:18:56.910278 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-6h5gc_6e000db3-2d29-4608-9a70-cfe88094a950/kube-rbac-proxy/0.log"
Oct 03 15:18:57 crc kubenswrapper[4636]: I1003 15:18:57.118318 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-6h5gc_6e000db3-2d29-4608-9a70-cfe88094a950/manager/0.log"
Oct 03 15:18:57 crc kubenswrapper[4636]: I1003 15:18:57.156506 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-699b87f775-p7d6m_d9a0c033-eaea-4336-96e6-9664f726e50e/manager/0.log"
Oct 03 15:18:57 crc kubenswrapper[4636]: I1003 15:18:57.190865 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-699b87f775-p7d6m_d9a0c033-eaea-4336-96e6-9664f726e50e/kube-rbac-proxy/0.log"
Oct 03 15:18:57 crc kubenswrapper[4636]: I1003 15:18:57.309258 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-655d88ccb9-jcqmk_62436a9b-229c-486b-a715-6787e100d19b/kube-rbac-proxy/0.log"
Oct 03 15:18:57 crc kubenswrapper[4636]: I1003 15:18:57.460264 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-655d88ccb9-jcqmk_62436a9b-229c-486b-a715-6787e100d19b/manager/0.log"
Oct 03 15:18:57 crc kubenswrapper[4636]: I1003 15:18:57.487736 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-8knqr_dff12a21-eff6-45da-bf37-d3f0620f9c05/kube-rbac-proxy/0.log"
Oct 03 15:18:57 crc kubenswrapper[4636]: I1003 15:18:57.538602 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-8knqr_dff12a21-eff6-45da-bf37-d3f0620f9c05/manager/0.log"
Oct 03 15:18:57 crc kubenswrapper[4636]: I1003 15:18:57.731807 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-2tjsj_8563d341-44cb-43b4-b7a8-ba3beeac60ea/kube-rbac-proxy/0.log"
Oct 03 15:18:57 crc kubenswrapper[4636]: I1003 15:18:57.791470 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-2tjsj_8563d341-44cb-43b4-b7a8-ba3beeac60ea/manager/0.log"
Oct 03 15:18:57 crc kubenswrapper[4636]: I1003 15:18:57.930325 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-9dc5p_89e06d08-9381-4aff-ba52-682080bd03bb/kube-rbac-proxy/0.log"
Oct 03 15:18:58 crc kubenswrapper[4636]: I1003 15:18:58.027716 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-9dc5p_89e06d08-9381-4aff-ba52-682080bd03bb/manager/0.log"
Oct 03 15:18:58 crc kubenswrapper[4636]: I1003 15:18:58.051730 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-cbxrx_60ec0b38-a07e-46e2-bc94-1af33d301eb6/kube-rbac-proxy/0.log"
Oct 03 15:18:58 crc kubenswrapper[4636]: I1003 15:18:58.291739 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-48wqb_2f612d08-a478-46d7-aefd-f31051af25d9/kube-rbac-proxy/0.log"
Oct 03 15:18:58 crc kubenswrapper[4636]: I1003 15:18:58.295869 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-48wqb_2f612d08-a478-46d7-aefd-f31051af25d9/manager/0.log"
Oct 03 15:18:58 crc kubenswrapper[4636]: I1003 15:18:58.333338 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-cbxrx_60ec0b38-a07e-46e2-bc94-1af33d301eb6/manager/0.log"
Oct 03 15:18:58 crc kubenswrapper[4636]: I1003 15:18:58.490641 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj_314cbc97-254d-4e64-a06f-68c7b0488c46/manager/0.log"
Oct 03 15:18:58 crc kubenswrapper[4636]: I1003 15:18:58.491484 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj_314cbc97-254d-4e64-a06f-68c7b0488c46/kube-rbac-proxy/0.log"
Oct 03 15:18:58 crc kubenswrapper[4636]: I1003 15:18:58.640842 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6dfbbfcbb4-flhg6_a119c810-cd24-4c51-a23b-88776132f825/kube-rbac-proxy/0.log"
Oct 03 15:18:58 crc kubenswrapper[4636]: I1003 15:18:58.800667 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-66d65dc5dc-ljjsx_c266feaf-9983-414a-b65e-5a13fc55c419/kube-rbac-proxy/0.log"
Oct 03 15:18:59 crc kubenswrapper[4636]: I1003 15:18:59.382937 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-bsvgc_8721857c-625f-4884-bb46-55f9ce071491/registry-server/0.log"
Oct 03 15:18:59 crc kubenswrapper[4636]: I1003 15:18:59.464018 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-66d65dc5dc-ljjsx_c266feaf-9983-414a-b65e-5a13fc55c419/operator/0.log"
Oct 03 15:18:59 crc kubenswrapper[4636]: I1003 15:18:59.550415 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-579449c7d5-8nj2c_f8f9f506-672a-4f93-8645-f0cd608feed0/kube-rbac-proxy/0.log"
Oct 03 15:18:59 crc kubenswrapper[4636]: I1003 15:18:59.725996 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-579449c7d5-8nj2c_f8f9f506-672a-4f93-8645-f0cd608feed0/manager/0.log"
Oct 03 15:18:59 crc kubenswrapper[4636]: I1003 15:18:59.779289 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-lkd7z_ad1290bf-25e9-4766-8398-ff4811e65cad/kube-rbac-proxy/0.log"
Oct 03 15:18:59 crc kubenswrapper[4636]: I1003 15:18:59.909270 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-lkd7z_ad1290bf-25e9-4766-8398-ff4811e65cad/manager/0.log"
Oct 03 15:19:00 crc kubenswrapper[4636]: I1003 15:19:00.059446 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-kmz9b_80c4c4f6-4616-48a9-98a7-f38ebdc58514/operator/0.log"
Oct 03 15:19:00 crc kubenswrapper[4636]: I1003 15:19:00.144286 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6dfbbfcbb4-flhg6_a119c810-cd24-4c51-a23b-88776132f825/manager/0.log"
Oct 03 15:19:00 crc kubenswrapper[4636]: I1003 15:19:00.205563 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-7qrjk_b3cb07c2-c2b9-4421-baba-ede1bed11656/manager/0.log"
Oct 03 15:19:00 crc kubenswrapper[4636]: I1003 15:19:00.225061 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-7qrjk_b3cb07c2-c2b9-4421-baba-ede1bed11656/kube-rbac-proxy/0.log"
Oct 03 15:19:00 crc kubenswrapper[4636]: I1003 15:19:00.517651 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-8qdtd_126025f8-40af-4a27-a9cc-8ece19d269b0/kube-rbac-proxy/0.log"
Oct 03 15:19:00 crc kubenswrapper[4636]: I1003 15:19:00.614340 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-8qdtd_126025f8-40af-4a27-a9cc-8ece19d269b0/manager/0.log"
Oct 03 15:19:00 crc kubenswrapper[4636]: I1003 15:19:00.615607 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-lc5hb_58d5890d-301f-43e9-b627-40f17f79da7f/kube-rbac-proxy/0.log"
Oct 03 15:19:01 crc kubenswrapper[4636]: I1003 15:19:01.037040 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-xj7dp_24b72852-6d98-4011-9643-5079fa6f8076/kube-rbac-proxy/0.log"
Oct 03 15:19:01 crc kubenswrapper[4636]: I1003 15:19:01.056521 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-lc5hb_58d5890d-301f-43e9-b627-40f17f79da7f/manager/0.log"
Oct 03 15:19:01 crc kubenswrapper[4636]: I1003 15:19:01.080204 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-xj7dp_24b72852-6d98-4011-9643-5079fa6f8076/manager/0.log"
Oct 03 15:19:17 crc kubenswrapper[4636]: I1003 15:19:17.964350 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-r49hv_0499c819-4b67-4882-9354-f7b9d6d2adc7/control-plane-machine-set-operator/0.log"
Oct 03 15:19:18 crc kubenswrapper[4636]: I1003 15:19:18.083598 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qzkgg_e697897f-0594-48da-967d-e429421b8fec/kube-rbac-proxy/0.log"
Oct 03 15:19:18 crc kubenswrapper[4636]: I1003 15:19:18.126178 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qzkgg_e697897f-0594-48da-967d-e429421b8fec/machine-api-operator/0.log"
Oct 03 15:19:30 crc kubenswrapper[4636]: I1003 15:19:30.524090 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-tswd6_be83bffc-d4e8-469a-85d9-6cc8ec6b64f4/cert-manager-controller/0.log"
Oct 03 15:19:30 crc kubenswrapper[4636]: I1003 15:19:30.698812 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-lzr2w_2974bed1-bc60-45f9-a4ce-42f14db27998/cert-manager-cainjector/0.log"
Oct 03 15:19:30 crc kubenswrapper[4636]: I1003 15:19:30.709446 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-jw6vl_d933c0ac-7ab5-4b2f-9602-5b277d92679e/cert-manager-webhook/0.log"
Oct 03 15:19:42 crc kubenswrapper[4636]: I1003 15:19:42.298963 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-zztw8_7bc7eb6e-0aa6-44a5-914e-7f3a97421f50/nmstate-console-plugin/0.log"
Oct 03 15:19:42 crc kubenswrapper[4636]: I1003 15:19:42.463452 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-mtj6z_89380ab9-db32-4562-aec2-69a9f3c703b6/nmstate-handler/0.log"
Oct 03 15:19:42 crc kubenswrapper[4636]: I1003 15:19:42.534391 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-vbc2m_438131cc-c24c-40a2-b874-8d1dca095f61/kube-rbac-proxy/0.log"
Oct 03 15:19:42 crc kubenswrapper[4636]: I1003 15:19:42.568887 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-vbc2m_438131cc-c24c-40a2-b874-8d1dca095f61/nmstate-metrics/0.log"
Oct 03 15:19:42 crc kubenswrapper[4636]: I1003 15:19:42.749727 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-7zh6n_a043488f-1ceb-4faa-a72a-76172cf550f7/nmstate-operator/0.log"
Oct 03 15:19:42 crc kubenswrapper[4636]: I1003 15:19:42.839645 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-46ppb_dc21bb9c-24c3-4267-ab8d-96ed8e255c69/nmstate-webhook/0.log"
Oct 03 15:19:56 crc kubenswrapper[4636]: I1003 15:19:56.616305 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-87w8j_f5c8bfd9-03d0-45ec-825a-d0c8f613c29c/kube-rbac-proxy/0.log"
Oct 03 15:19:56 crc kubenswrapper[4636]: I1003 15:19:56.696188 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-87w8j_f5c8bfd9-03d0-45ec-825a-d0c8f613c29c/controller/0.log"
Oct 03 15:19:56 crc kubenswrapper[4636]: I1003 15:19:56.803502 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-frr-files/0.log"
Oct 03 15:19:57 crc kubenswrapper[4636]: I1003 15:19:57.567642 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-reloader/0.log"
Oct 03 15:19:57 crc kubenswrapper[4636]: I1003 15:19:57.629817 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-metrics/0.log"
Oct 03 15:19:57 crc kubenswrapper[4636]: I1003 15:19:57.647140 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-frr-files/0.log"
Oct 03 15:19:57 crc kubenswrapper[4636]: I1003 15:19:57.686979 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-reloader/0.log"
Oct 03 15:19:57 crc kubenswrapper[4636]: I1003 15:19:57.935308 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-reloader/0.log"
Oct 03 15:19:57 crc kubenswrapper[4636]: I1003 15:19:57.961134 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-metrics/0.log"
Oct 03 15:19:57 crc kubenswrapper[4636]: I1003 15:19:57.978007 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-frr-files/0.log"
Oct 03 15:19:58 crc kubenswrapper[4636]: I1003 15:19:58.003775 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-metrics/0.log"
Oct 03 15:19:58 crc kubenswrapper[4636]: I1003 15:19:58.155798 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-frr-files/0.log"
Oct 03 15:19:58 crc kubenswrapper[4636]: I1003 15:19:58.159852 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-metrics/0.log"
Oct 03 15:19:58 crc kubenswrapper[4636]: I1003 15:19:58.247874 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-reloader/0.log"
Oct 03 15:19:58 crc kubenswrapper[4636]: I1003 15:19:58.296320 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/controller/0.log"
Oct 03 15:19:58 crc kubenswrapper[4636]: I1003 15:19:58.392209 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/frr-metrics/0.log"
Oct 03 15:19:58 crc kubenswrapper[4636]: I1003 15:19:58.475931 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/kube-rbac-proxy/0.log"
Oct 03 15:19:59 crc kubenswrapper[4636]: I1003 15:19:59.095065 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/kube-rbac-proxy-frr/0.log"
Oct 03 15:19:59 crc kubenswrapper[4636]: I1003 15:19:59.126081 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/reloader/0.log"
Oct 03 15:19:59 crc kubenswrapper[4636]: I1003 15:19:59.341432 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-ttj4x_360e3dae-23f1-4ddd-9815-d6a41e611501/frr-k8s-webhook-server/0.log"
Oct 03 15:19:59 crc kubenswrapper[4636]: I1003 15:19:59.509801 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-79b89cf995-qtfsw_4d75cbbf-e22d-49aa-ae40-c77a69421e1a/manager/0.log"
Oct 03 15:19:59 crc kubenswrapper[4636]: I1003 15:19:59.697194 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7d746fccb7-rtxlz_fdeca3bd-7bca-4463-b480-1b94361da961/webhook-server/0.log"
Oct 03 15:19:59 crc kubenswrapper[4636]: I1003 15:19:59.888258 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/frr/0.log"
Oct 03 15:19:59 crc kubenswrapper[4636]: I1003 15:19:59.903774 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ggz7j_39a6b95f-24cf-4365-93c0-b47b7a7672fb/kube-rbac-proxy/0.log"
Oct 03 15:20:00 crc kubenswrapper[4636]: I1003 15:20:00.394483 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ggz7j_39a6b95f-24cf-4365-93c0-b47b7a7672fb/speaker/0.log"
Oct 03 15:20:09 crc kubenswrapper[4636]: I1003 15:20:09.162908 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:20:09 crc kubenswrapper[4636]: I1003 15:20:09.163508 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:20:12 crc kubenswrapper[4636]: I1003 15:20:12.212699 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj_a83000c5-7baa-4587-980d-90391869a32c/util/0.log" Oct 03 15:20:12 crc kubenswrapper[4636]: I1003 15:20:12.441157 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj_a83000c5-7baa-4587-980d-90391869a32c/util/0.log" Oct 03 15:20:12 crc kubenswrapper[4636]: I1003 15:20:12.442694 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj_a83000c5-7baa-4587-980d-90391869a32c/pull/0.log" Oct 03 15:20:12 crc kubenswrapper[4636]: I1003 15:20:12.468972 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj_a83000c5-7baa-4587-980d-90391869a32c/pull/0.log" Oct 03 15:20:12 crc kubenswrapper[4636]: I1003 15:20:12.638695 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj_a83000c5-7baa-4587-980d-90391869a32c/util/0.log" Oct 03 15:20:12 crc kubenswrapper[4636]: I1003 15:20:12.668599 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj_a83000c5-7baa-4587-980d-90391869a32c/pull/0.log" Oct 03 15:20:12 crc kubenswrapper[4636]: I1003 15:20:12.709987 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj_a83000c5-7baa-4587-980d-90391869a32c/extract/0.log" Oct 03 15:20:12 crc kubenswrapper[4636]: I1003 15:20:12.836913 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cx5c_955b6210-120c-4407-a1b0-2565f8407a8f/extract-utilities/0.log" Oct 03 15:20:13 crc kubenswrapper[4636]: I1003 15:20:13.038521 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cx5c_955b6210-120c-4407-a1b0-2565f8407a8f/extract-content/0.log" Oct 03 15:20:13 crc kubenswrapper[4636]: I1003 15:20:13.050626 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cx5c_955b6210-120c-4407-a1b0-2565f8407a8f/extract-utilities/0.log" Oct 03 15:20:13 crc kubenswrapper[4636]: I1003 15:20:13.073983 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cx5c_955b6210-120c-4407-a1b0-2565f8407a8f/extract-content/0.log" Oct 03 15:20:13 crc kubenswrapper[4636]: I1003 15:20:13.211530 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cx5c_955b6210-120c-4407-a1b0-2565f8407a8f/extract-utilities/0.log" Oct 03 15:20:13 crc kubenswrapper[4636]: 
I1003 15:20:13.284328 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cx5c_955b6210-120c-4407-a1b0-2565f8407a8f/extract-content/0.log" Oct 03 15:20:13 crc kubenswrapper[4636]: I1003 15:20:13.413377 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p7wgr_6f8d0287-e1cd-461f-917e-febaa7ac576e/extract-utilities/0.log" Oct 03 15:20:13 crc kubenswrapper[4636]: I1003 15:20:13.775692 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p7wgr_6f8d0287-e1cd-461f-917e-febaa7ac576e/extract-utilities/0.log" Oct 03 15:20:13 crc kubenswrapper[4636]: I1003 15:20:13.835130 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cx5c_955b6210-120c-4407-a1b0-2565f8407a8f/registry-server/0.log" Oct 03 15:20:13 crc kubenswrapper[4636]: I1003 15:20:13.836230 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p7wgr_6f8d0287-e1cd-461f-917e-febaa7ac576e/extract-content/0.log" Oct 03 15:20:13 crc kubenswrapper[4636]: I1003 15:20:13.874380 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p7wgr_6f8d0287-e1cd-461f-917e-febaa7ac576e/extract-content/0.log" Oct 03 15:20:14 crc kubenswrapper[4636]: I1003 15:20:14.035898 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p7wgr_6f8d0287-e1cd-461f-917e-febaa7ac576e/extract-content/0.log" Oct 03 15:20:14 crc kubenswrapper[4636]: I1003 15:20:14.082236 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p7wgr_6f8d0287-e1cd-461f-917e-febaa7ac576e/extract-utilities/0.log" Oct 03 15:20:14 crc kubenswrapper[4636]: I1003 15:20:14.283589 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr_b557fea4-b20c-4f54-88af-89ed7d755cda/util/0.log" Oct 03 15:20:14 crc kubenswrapper[4636]: I1003 15:20:14.639251 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr_b557fea4-b20c-4f54-88af-89ed7d755cda/util/0.log" Oct 03 15:20:14 crc kubenswrapper[4636]: I1003 15:20:14.671905 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr_b557fea4-b20c-4f54-88af-89ed7d755cda/pull/0.log" Oct 03 15:20:14 crc kubenswrapper[4636]: I1003 15:20:14.681977 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr_b557fea4-b20c-4f54-88af-89ed7d755cda/pull/0.log" Oct 03 15:20:14 crc kubenswrapper[4636]: I1003 15:20:14.812442 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p7wgr_6f8d0287-e1cd-461f-917e-febaa7ac576e/registry-server/0.log" Oct 03 15:20:14 crc kubenswrapper[4636]: I1003 15:20:14.982319 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr_b557fea4-b20c-4f54-88af-89ed7d755cda/pull/0.log" Oct 03 15:20:15 crc kubenswrapper[4636]: I1003 15:20:15.004238 4636 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr_b557fea4-b20c-4f54-88af-89ed7d755cda/extract/0.log" Oct 03 15:20:15 crc kubenswrapper[4636]: I1003 15:20:15.007890 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr_b557fea4-b20c-4f54-88af-89ed7d755cda/util/0.log" Oct 03 15:20:15 crc kubenswrapper[4636]: I1003 15:20:15.540389 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-ncqfg_eb4639ab-5b3c-4f36-9c1e-077930e571e3/marketplace-operator/0.log" Oct 03 15:20:15 crc kubenswrapper[4636]: I1003 15:20:15.592323 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k8gjq_8df150bb-9cae-4839-bc31-0211d3610788/extract-utilities/0.log" Oct 03 15:20:15 crc kubenswrapper[4636]: I1003 15:20:15.783843 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k8gjq_8df150bb-9cae-4839-bc31-0211d3610788/extract-content/0.log" Oct 03 15:20:15 crc kubenswrapper[4636]: I1003 15:20:15.787476 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k8gjq_8df150bb-9cae-4839-bc31-0211d3610788/extract-content/0.log" Oct 03 15:20:15 crc kubenswrapper[4636]: I1003 15:20:15.793834 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k8gjq_8df150bb-9cae-4839-bc31-0211d3610788/extract-utilities/0.log" Oct 03 15:20:15 crc kubenswrapper[4636]: I1003 15:20:15.961715 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k8gjq_8df150bb-9cae-4839-bc31-0211d3610788/extract-utilities/0.log" Oct 03 15:20:16 crc kubenswrapper[4636]: I1003 15:20:16.020331 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k8gjq_8df150bb-9cae-4839-bc31-0211d3610788/extract-content/0.log" Oct 03 15:20:16 crc kubenswrapper[4636]: I1003 15:20:16.050276 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s4hpr_7677fae0-2c20-47c0-aae2-52657add9d92/extract-utilities/0.log" Oct 03 15:20:16 crc kubenswrapper[4636]: I1003 15:20:16.190666 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s4hpr_7677fae0-2c20-47c0-aae2-52657add9d92/extract-utilities/0.log" Oct 03 15:20:16 crc kubenswrapper[4636]: I1003 15:20:16.198965 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s4hpr_7677fae0-2c20-47c0-aae2-52657add9d92/extract-content/0.log" Oct 03 15:20:16 crc kubenswrapper[4636]: I1003 15:20:16.241392 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s4hpr_7677fae0-2c20-47c0-aae2-52657add9d92/extract-content/0.log" Oct 03 15:20:16 crc kubenswrapper[4636]: I1003 15:20:16.375182 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s4hpr_7677fae0-2c20-47c0-aae2-52657add9d92/extract-content/0.log" Oct 03 15:20:16 crc kubenswrapper[4636]: I1003 15:20:16.459888 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s4hpr_7677fae0-2c20-47c0-aae2-52657add9d92/extract-utilities/0.log" Oct 03 15:20:16 crc kubenswrapper[4636]: I1003 15:20:16.727472 4636 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k8gjq_8df150bb-9cae-4839-bc31-0211d3610788/registry-server/0.log" Oct 03 15:20:17 crc kubenswrapper[4636]: I1003 15:20:17.009047 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s4hpr_7677fae0-2c20-47c0-aae2-52657add9d92/registry-server/0.log" Oct 03 15:20:39 crc kubenswrapper[4636]: I1003 15:20:39.162960 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:20:39 crc kubenswrapper[4636]: I1003 15:20:39.163497 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:21:09 crc kubenswrapper[4636]: I1003 15:21:09.163149 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:21:09 crc kubenswrapper[4636]: I1003 15:21:09.163789 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:21:09 crc kubenswrapper[4636]: I1003 15:21:09.163867 4636 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 15:21:09 crc kubenswrapper[4636]: I1003 15:21:09.165110 4636 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"913c176d23c8c8c6a552be6bf2fd2627170ae4a3ef1ef4abe575bb231ee8bb69"} pod="openshift-machine-config-operator/machine-config-daemon-ngmch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 15:21:09 crc kubenswrapper[4636]: I1003 15:21:09.165209 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" containerID="cri-o://913c176d23c8c8c6a552be6bf2fd2627170ae4a3ef1ef4abe575bb231ee8bb69" gracePeriod=600 Oct 03 15:21:09 crc kubenswrapper[4636]: I1003 15:21:09.811627 4636 generic.go:334] "Generic (PLEG): container finished" podID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerID="913c176d23c8c8c6a552be6bf2fd2627170ae4a3ef1ef4abe575bb231ee8bb69" exitCode=0 Oct 03 15:21:09 crc kubenswrapper[4636]: I1003 15:21:09.811971 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerDied","Data":"913c176d23c8c8c6a552be6bf2fd2627170ae4a3ef1ef4abe575bb231ee8bb69"} Oct 03 15:21:09 crc 
kubenswrapper[4636]: I1003 15:21:09.812003 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerStarted","Data":"5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2"} Oct 03 15:21:09 crc kubenswrapper[4636]: I1003 15:21:09.812022 4636 scope.go:117] "RemoveContainer" containerID="bd2fbfec50d2c7f989a259226c9426f122c471aaec6f2106551358a086abb5cc" Oct 03 15:22:48 crc kubenswrapper[4636]: I1003 15:22:48.373492 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6vcq2"] Oct 03 15:22:48 crc kubenswrapper[4636]: E1003 15:22:48.374324 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9da472-6a5c-40c7-890d-960b548115ca" containerName="container-00" Oct 03 15:22:48 crc kubenswrapper[4636]: I1003 15:22:48.374336 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9da472-6a5c-40c7-890d-960b548115ca" containerName="container-00" Oct 03 15:22:48 crc kubenswrapper[4636]: I1003 15:22:48.374535 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b9da472-6a5c-40c7-890d-960b548115ca" containerName="container-00" Oct 03 15:22:48 crc kubenswrapper[4636]: I1003 15:22:48.375824 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6vcq2" Oct 03 15:22:48 crc kubenswrapper[4636]: I1003 15:22:48.391171 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6vcq2"] Oct 03 15:22:48 crc kubenswrapper[4636]: I1003 15:22:48.398562 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba46466-b881-42f5-a24f-d4bdd1880169-catalog-content\") pod \"community-operators-6vcq2\" (UID: \"8ba46466-b881-42f5-a24f-d4bdd1880169\") " pod="openshift-marketplace/community-operators-6vcq2" Oct 03 15:22:48 crc kubenswrapper[4636]: I1003 15:22:48.398610 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpk4r\" (UniqueName: \"kubernetes.io/projected/8ba46466-b881-42f5-a24f-d4bdd1880169-kube-api-access-cpk4r\") pod \"community-operators-6vcq2\" (UID: \"8ba46466-b881-42f5-a24f-d4bdd1880169\") " pod="openshift-marketplace/community-operators-6vcq2" Oct 03 15:22:48 crc kubenswrapper[4636]: I1003 15:22:48.398669 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba46466-b881-42f5-a24f-d4bdd1880169-utilities\") pod \"community-operators-6vcq2\" (UID: \"8ba46466-b881-42f5-a24f-d4bdd1880169\") " pod="openshift-marketplace/community-operators-6vcq2" Oct 03 15:22:48 crc kubenswrapper[4636]: I1003 15:22:48.500016 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba46466-b881-42f5-a24f-d4bdd1880169-catalog-content\") pod \"community-operators-6vcq2\" (UID: \"8ba46466-b881-42f5-a24f-d4bdd1880169\") " pod="openshift-marketplace/community-operators-6vcq2" Oct 03 15:22:48 crc kubenswrapper[4636]: I1003 15:22:48.500058 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpk4r\" (UniqueName: \"kubernetes.io/projected/8ba46466-b881-42f5-a24f-d4bdd1880169-kube-api-access-cpk4r\") pod 
\"community-operators-6vcq2\" (UID: \"8ba46466-b881-42f5-a24f-d4bdd1880169\") " pod="openshift-marketplace/community-operators-6vcq2" Oct 03 15:22:48 crc kubenswrapper[4636]: I1003 15:22:48.500133 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba46466-b881-42f5-a24f-d4bdd1880169-utilities\") pod \"community-operators-6vcq2\" (UID: \"8ba46466-b881-42f5-a24f-d4bdd1880169\") " pod="openshift-marketplace/community-operators-6vcq2" Oct 03 15:22:48 crc kubenswrapper[4636]: I1003 15:22:48.500535 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba46466-b881-42f5-a24f-d4bdd1880169-catalog-content\") pod \"community-operators-6vcq2\" (UID: \"8ba46466-b881-42f5-a24f-d4bdd1880169\") " pod="openshift-marketplace/community-operators-6vcq2" Oct 03 15:22:48 crc kubenswrapper[4636]: I1003 15:22:48.500568 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba46466-b881-42f5-a24f-d4bdd1880169-utilities\") pod \"community-operators-6vcq2\" (UID: \"8ba46466-b881-42f5-a24f-d4bdd1880169\") " pod="openshift-marketplace/community-operators-6vcq2" Oct 03 15:22:48 crc kubenswrapper[4636]: I1003 15:22:48.520038 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpk4r\" (UniqueName: \"kubernetes.io/projected/8ba46466-b881-42f5-a24f-d4bdd1880169-kube-api-access-cpk4r\") pod \"community-operators-6vcq2\" (UID: \"8ba46466-b881-42f5-a24f-d4bdd1880169\") " pod="openshift-marketplace/community-operators-6vcq2" Oct 03 15:22:48 crc kubenswrapper[4636]: I1003 15:22:48.694468 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6vcq2" Oct 03 15:22:49 crc kubenswrapper[4636]: I1003 15:22:49.556621 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6vcq2"] Oct 03 15:22:49 crc kubenswrapper[4636]: I1003 15:22:49.729666 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6vcq2" event={"ID":"8ba46466-b881-42f5-a24f-d4bdd1880169","Type":"ContainerStarted","Data":"2e6670f35afa6f7284c3660585b81510da046dab098e11476c1de623a133c18d"} Oct 03 15:22:50 crc kubenswrapper[4636]: I1003 15:22:50.745341 4636 generic.go:334] "Generic (PLEG): container finished" podID="8ba46466-b881-42f5-a24f-d4bdd1880169" containerID="dc2866559e8105cac7c3baef4bd7858fdc54458c87c23db4bd1984ba08a898da" exitCode=0 Oct 03 15:22:50 crc kubenswrapper[4636]: I1003 15:22:50.745379 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6vcq2" event={"ID":"8ba46466-b881-42f5-a24f-d4bdd1880169","Type":"ContainerDied","Data":"dc2866559e8105cac7c3baef4bd7858fdc54458c87c23db4bd1984ba08a898da"} Oct 03 15:22:50 crc kubenswrapper[4636]: I1003 15:22:50.746723 4636 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 15:22:52 crc kubenswrapper[4636]: I1003 15:22:52.769620 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6vcq2" event={"ID":"8ba46466-b881-42f5-a24f-d4bdd1880169","Type":"ContainerStarted","Data":"156e9eeee5a7550402d293989151d3ba20957c9e50d5f4526b82ca3cb5386dbf"} Oct 03 15:22:53 crc kubenswrapper[4636]: I1003 15:22:53.779921 4636 generic.go:334] "Generic (PLEG): container finished" podID="8ba46466-b881-42f5-a24f-d4bdd1880169" containerID="156e9eeee5a7550402d293989151d3ba20957c9e50d5f4526b82ca3cb5386dbf" exitCode=0 Oct 03 15:22:53 crc kubenswrapper[4636]: I1003 15:22:53.780034 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6vcq2" event={"ID":"8ba46466-b881-42f5-a24f-d4bdd1880169","Type":"ContainerDied","Data":"156e9eeee5a7550402d293989151d3ba20957c9e50d5f4526b82ca3cb5386dbf"} Oct 03 15:22:54 crc kubenswrapper[4636]: I1003 15:22:54.811155 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6vcq2" event={"ID":"8ba46466-b881-42f5-a24f-d4bdd1880169","Type":"ContainerStarted","Data":"332441f6c3d53472111d82e8fdb7fda7b2d8a79e7e16227c28c5b155a7b9bab0"} Oct 03 15:22:54 crc kubenswrapper[4636]: I1003 15:22:54.845531 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6vcq2" podStartSLOduration=3.384388582 podStartE2EDuration="6.845510124s" podCreationTimestamp="2025-10-03 15:22:48 +0000 UTC" firstStartedPulling="2025-10-03 15:22:50.746527164 +0000 UTC m=+4920.605253411" lastFinishedPulling="2025-10-03 15:22:54.207648706 +0000 UTC m=+4924.066374953" observedRunningTime="2025-10-03 15:22:54.840064521 +0000 UTC m=+4924.698790768" watchObservedRunningTime="2025-10-03 15:22:54.845510124 +0000 UTC m=+4924.704236371" Oct 03 15:22:58 crc kubenswrapper[4636]: I1003 15:22:58.695389 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6vcq2" Oct 03 15:22:58 crc kubenswrapper[4636]: I1003 15:22:58.696979 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-6vcq2" Oct 03 15:22:58 crc kubenswrapper[4636]: I1003 15:22:58.749675 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6vcq2" Oct 03 15:23:00 crc kubenswrapper[4636]: I1003 15:23:00.031840 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6vcq2" Oct 03 15:23:00 crc kubenswrapper[4636]: I1003 15:23:00.077891 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6vcq2"] Oct 03 15:23:01 crc kubenswrapper[4636]: I1003 15:23:01.871691 4636 generic.go:334] "Generic (PLEG): container finished" podID="df99b419-f085-4d93-9a9b-c68e56153aeb" containerID="5c1727e3cadf5d87e4029a8b2a00b0a4b180bed513c73a55840772a7137dfeb7" exitCode=0 Oct 03 15:23:01 crc kubenswrapper[4636]: I1003 15:23:01.871897 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r2hg7/must-gather-wd58h" event={"ID":"df99b419-f085-4d93-9a9b-c68e56153aeb","Type":"ContainerDied","Data":"5c1727e3cadf5d87e4029a8b2a00b0a4b180bed513c73a55840772a7137dfeb7"} Oct 03 15:23:01 crc kubenswrapper[4636]: I1003 15:23:01.872259 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6vcq2" podUID="8ba46466-b881-42f5-a24f-d4bdd1880169" containerName="registry-server" containerID="cri-o://332441f6c3d53472111d82e8fdb7fda7b2d8a79e7e16227c28c5b155a7b9bab0" gracePeriod=2 Oct 03 15:23:01 crc kubenswrapper[4636]: I1003 15:23:01.873000 4636 scope.go:117] "RemoveContainer" containerID="5c1727e3cadf5d87e4029a8b2a00b0a4b180bed513c73a55840772a7137dfeb7" Oct 03 15:23:02 crc kubenswrapper[4636]: I1003 15:23:02.313033 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6vcq2" Oct 03 15:23:02 crc kubenswrapper[4636]: I1003 15:23:02.483846 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpk4r\" (UniqueName: \"kubernetes.io/projected/8ba46466-b881-42f5-a24f-d4bdd1880169-kube-api-access-cpk4r\") pod \"8ba46466-b881-42f5-a24f-d4bdd1880169\" (UID: \"8ba46466-b881-42f5-a24f-d4bdd1880169\") " Oct 03 15:23:02 crc kubenswrapper[4636]: I1003 15:23:02.483893 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba46466-b881-42f5-a24f-d4bdd1880169-catalog-content\") pod \"8ba46466-b881-42f5-a24f-d4bdd1880169\" (UID: \"8ba46466-b881-42f5-a24f-d4bdd1880169\") " Oct 03 15:23:02 crc kubenswrapper[4636]: I1003 15:23:02.484854 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba46466-b881-42f5-a24f-d4bdd1880169-utilities\") pod \"8ba46466-b881-42f5-a24f-d4bdd1880169\" (UID: \"8ba46466-b881-42f5-a24f-d4bdd1880169\") " Oct 03 15:23:02 crc kubenswrapper[4636]: I1003 15:23:02.486042 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ba46466-b881-42f5-a24f-d4bdd1880169-utilities" (OuterVolumeSpecName: "utilities") pod "8ba46466-b881-42f5-a24f-d4bdd1880169" (UID: "8ba46466-b881-42f5-a24f-d4bdd1880169"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:23:02 crc kubenswrapper[4636]: I1003 15:23:02.490607 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba46466-b881-42f5-a24f-d4bdd1880169-kube-api-access-cpk4r" (OuterVolumeSpecName: "kube-api-access-cpk4r") pod "8ba46466-b881-42f5-a24f-d4bdd1880169" (UID: "8ba46466-b881-42f5-a24f-d4bdd1880169"). InnerVolumeSpecName "kube-api-access-cpk4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:23:02 crc kubenswrapper[4636]: I1003 15:23:02.536038 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ba46466-b881-42f5-a24f-d4bdd1880169-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ba46466-b881-42f5-a24f-d4bdd1880169" (UID: "8ba46466-b881-42f5-a24f-d4bdd1880169"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:23:02 crc kubenswrapper[4636]: I1003 15:23:02.587231 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba46466-b881-42f5-a24f-d4bdd1880169-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:23:02 crc kubenswrapper[4636]: I1003 15:23:02.587568 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpk4r\" (UniqueName: \"kubernetes.io/projected/8ba46466-b881-42f5-a24f-d4bdd1880169-kube-api-access-cpk4r\") on node \"crc\" DevicePath \"\"" Oct 03 15:23:02 crc kubenswrapper[4636]: I1003 15:23:02.587582 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba46466-b881-42f5-a24f-d4bdd1880169-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:23:02 crc kubenswrapper[4636]: I1003 15:23:02.720315 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-r2hg7_must-gather-wd58h_df99b419-f085-4d93-9a9b-c68e56153aeb/gather/0.log" Oct 03 15:23:02 crc kubenswrapper[4636]: I1003 15:23:02.887138 4636 generic.go:334] "Generic (PLEG): container finished" podID="8ba46466-b881-42f5-a24f-d4bdd1880169" containerID="332441f6c3d53472111d82e8fdb7fda7b2d8a79e7e16227c28c5b155a7b9bab0" exitCode=0 Oct 03 15:23:02 crc kubenswrapper[4636]: I1003 15:23:02.887177 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6vcq2" event={"ID":"8ba46466-b881-42f5-a24f-d4bdd1880169","Type":"ContainerDied","Data":"332441f6c3d53472111d82e8fdb7fda7b2d8a79e7e16227c28c5b155a7b9bab0"} Oct 03 15:23:02 crc kubenswrapper[4636]: I1003 15:23:02.887207 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6vcq2" event={"ID":"8ba46466-b881-42f5-a24f-d4bdd1880169","Type":"ContainerDied","Data":"2e6670f35afa6f7284c3660585b81510da046dab098e11476c1de623a133c18d"} Oct 03 15:23:02 crc kubenswrapper[4636]: I1003 15:23:02.887224 4636 scope.go:117] "RemoveContainer" containerID="332441f6c3d53472111d82e8fdb7fda7b2d8a79e7e16227c28c5b155a7b9bab0" Oct 03 15:23:02 crc kubenswrapper[4636]: I1003 15:23:02.887404 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6vcq2" Oct 03 15:23:02 crc kubenswrapper[4636]: I1003 15:23:02.922812 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6vcq2"] Oct 03 15:23:02 crc kubenswrapper[4636]: I1003 15:23:02.926311 4636 scope.go:117] "RemoveContainer" containerID="156e9eeee5a7550402d293989151d3ba20957c9e50d5f4526b82ca3cb5386dbf" Oct 03 15:23:02 crc kubenswrapper[4636]: I1003 15:23:02.934558 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6vcq2"] Oct 03 15:23:02 crc kubenswrapper[4636]: I1003 15:23:02.952241 4636 scope.go:117] "RemoveContainer" containerID="dc2866559e8105cac7c3baef4bd7858fdc54458c87c23db4bd1984ba08a898da" Oct 03 15:23:03 crc kubenswrapper[4636]: I1003 15:23:03.029003 4636 scope.go:117] "RemoveContainer" containerID="332441f6c3d53472111d82e8fdb7fda7b2d8a79e7e16227c28c5b155a7b9bab0" Oct 03 15:23:03 crc kubenswrapper[4636]: E1003 15:23:03.029633 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"332441f6c3d53472111d82e8fdb7fda7b2d8a79e7e16227c28c5b155a7b9bab0\": container with ID starting with 332441f6c3d53472111d82e8fdb7fda7b2d8a79e7e16227c28c5b155a7b9bab0 not found: ID does not exist" containerID="332441f6c3d53472111d82e8fdb7fda7b2d8a79e7e16227c28c5b155a7b9bab0" Oct 03 15:23:03 crc kubenswrapper[4636]: I1003 15:23:03.029667 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"332441f6c3d53472111d82e8fdb7fda7b2d8a79e7e16227c28c5b155a7b9bab0"} err="failed to get container status \"332441f6c3d53472111d82e8fdb7fda7b2d8a79e7e16227c28c5b155a7b9bab0\": rpc error: code = NotFound desc = could not find container \"332441f6c3d53472111d82e8fdb7fda7b2d8a79e7e16227c28c5b155a7b9bab0\": container with ID starting with 332441f6c3d53472111d82e8fdb7fda7b2d8a79e7e16227c28c5b155a7b9bab0 not found: ID does not exist" Oct 03 15:23:03 crc kubenswrapper[4636]: I1003 15:23:03.029688 4636 scope.go:117] "RemoveContainer" containerID="156e9eeee5a7550402d293989151d3ba20957c9e50d5f4526b82ca3cb5386dbf" Oct 03 15:23:03 crc kubenswrapper[4636]: E1003 15:23:03.030152 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"156e9eeee5a7550402d293989151d3ba20957c9e50d5f4526b82ca3cb5386dbf\": container with ID starting with 156e9eeee5a7550402d293989151d3ba20957c9e50d5f4526b82ca3cb5386dbf not found: ID does not exist" containerID="156e9eeee5a7550402d293989151d3ba20957c9e50d5f4526b82ca3cb5386dbf" Oct 03 15:23:03 crc kubenswrapper[4636]: I1003 15:23:03.030174 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"156e9eeee5a7550402d293989151d3ba20957c9e50d5f4526b82ca3cb5386dbf"} err="failed to get container status \"156e9eeee5a7550402d293989151d3ba20957c9e50d5f4526b82ca3cb5386dbf\": rpc error: code = NotFound desc = could not find container \"156e9eeee5a7550402d293989151d3ba20957c9e50d5f4526b82ca3cb5386dbf\": container with ID starting with 156e9eeee5a7550402d293989151d3ba20957c9e50d5f4526b82ca3cb5386dbf not found: ID does not exist" Oct 03 15:23:03 crc kubenswrapper[4636]: I1003 15:23:03.030188 4636 scope.go:117] "RemoveContainer" containerID="dc2866559e8105cac7c3baef4bd7858fdc54458c87c23db4bd1984ba08a898da" Oct 03 15:23:03 crc kubenswrapper[4636]: E1003 15:23:03.030453 4636 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"dc2866559e8105cac7c3baef4bd7858fdc54458c87c23db4bd1984ba08a898da\": container with ID starting with dc2866559e8105cac7c3baef4bd7858fdc54458c87c23db4bd1984ba08a898da not found: ID does not exist" containerID="dc2866559e8105cac7c3baef4bd7858fdc54458c87c23db4bd1984ba08a898da" Oct 03 15:23:03 crc kubenswrapper[4636]: I1003 15:23:03.030484 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc2866559e8105cac7c3baef4bd7858fdc54458c87c23db4bd1984ba08a898da"} err="failed to get container status \"dc2866559e8105cac7c3baef4bd7858fdc54458c87c23db4bd1984ba08a898da\": rpc error: code = NotFound desc = could not find container \"dc2866559e8105cac7c3baef4bd7858fdc54458c87c23db4bd1984ba08a898da\": container with ID starting with dc2866559e8105cac7c3baef4bd7858fdc54458c87c23db4bd1984ba08a898da not found: ID does not exist" Oct 03 15:23:04 crc kubenswrapper[4636]: I1003 15:23:04.804214 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ba46466-b881-42f5-a24f-d4bdd1880169" path="/var/lib/kubelet/pods/8ba46466-b881-42f5-a24f-d4bdd1880169/volumes" Oct 03 15:23:09 crc kubenswrapper[4636]: I1003 15:23:09.163566 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:23:09 crc kubenswrapper[4636]: I1003 15:23:09.164089 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:23:11 crc kubenswrapper[4636]: I1003 15:23:11.185819 4636 scope.go:117] "RemoveContainer" containerID="3e567e1edf4c7ea08dbdc8f68993bf0900be4dd3b9ce68e257ebf3f3bc99c27f" Oct 03 15:23:11 crc kubenswrapper[4636]: I1003 15:23:11.600358 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r2hg7/must-gather-wd58h"] Oct 03 15:23:11 crc kubenswrapper[4636]: I1003 15:23:11.600952 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-r2hg7/must-gather-wd58h" podUID="df99b419-f085-4d93-9a9b-c68e56153aeb" containerName="copy" containerID="cri-o://163e2f5889616e6ae9ca30297e0266fa0c20a71bdf7b7fec91cf28eac1f88990" gracePeriod=2 Oct 03 15:23:11 crc kubenswrapper[4636]: I1003 15:23:11.613816 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r2hg7/must-gather-wd58h"] Oct 03 15:23:11 crc kubenswrapper[4636]: I1003 15:23:11.981875 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-r2hg7_must-gather-wd58h_df99b419-f085-4d93-9a9b-c68e56153aeb/copy/0.log" Oct 03 15:23:11 crc kubenswrapper[4636]: I1003 15:23:11.982738 4636 generic.go:334] "Generic (PLEG): container finished" podID="df99b419-f085-4d93-9a9b-c68e56153aeb" containerID="163e2f5889616e6ae9ca30297e0266fa0c20a71bdf7b7fec91cf28eac1f88990" exitCode=143 Oct 03 15:23:12 crc kubenswrapper[4636]: I1003 15:23:12.120838 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-r2hg7_must-gather-wd58h_df99b419-f085-4d93-9a9b-c68e56153aeb/copy/0.log" Oct 03 15:23:12 crc 
kubenswrapper[4636]: I1003 15:23:12.121197 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r2hg7/must-gather-wd58h" Oct 03 15:23:12 crc kubenswrapper[4636]: I1003 15:23:12.268865 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfp45\" (UniqueName: \"kubernetes.io/projected/df99b419-f085-4d93-9a9b-c68e56153aeb-kube-api-access-lfp45\") pod \"df99b419-f085-4d93-9a9b-c68e56153aeb\" (UID: \"df99b419-f085-4d93-9a9b-c68e56153aeb\") " Oct 03 15:23:12 crc kubenswrapper[4636]: I1003 15:23:12.269261 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/df99b419-f085-4d93-9a9b-c68e56153aeb-must-gather-output\") pod \"df99b419-f085-4d93-9a9b-c68e56153aeb\" (UID: \"df99b419-f085-4d93-9a9b-c68e56153aeb\") " Oct 03 15:23:12 crc kubenswrapper[4636]: I1003 15:23:12.478001 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df99b419-f085-4d93-9a9b-c68e56153aeb-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "df99b419-f085-4d93-9a9b-c68e56153aeb" (UID: "df99b419-f085-4d93-9a9b-c68e56153aeb"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:23:12 crc kubenswrapper[4636]: I1003 15:23:12.575360 4636 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/df99b419-f085-4d93-9a9b-c68e56153aeb-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 03 15:23:12 crc kubenswrapper[4636]: I1003 15:23:12.591458 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df99b419-f085-4d93-9a9b-c68e56153aeb-kube-api-access-lfp45" (OuterVolumeSpecName: "kube-api-access-lfp45") pod "df99b419-f085-4d93-9a9b-c68e56153aeb" (UID: "df99b419-f085-4d93-9a9b-c68e56153aeb"). InnerVolumeSpecName "kube-api-access-lfp45". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:23:12 crc kubenswrapper[4636]: I1003 15:23:12.677268 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfp45\" (UniqueName: \"kubernetes.io/projected/df99b419-f085-4d93-9a9b-c68e56153aeb-kube-api-access-lfp45\") on node \"crc\" DevicePath \"\"" Oct 03 15:23:12 crc kubenswrapper[4636]: I1003 15:23:12.804215 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df99b419-f085-4d93-9a9b-c68e56153aeb" path="/var/lib/kubelet/pods/df99b419-f085-4d93-9a9b-c68e56153aeb/volumes" Oct 03 15:23:12 crc kubenswrapper[4636]: I1003 15:23:12.991284 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-r2hg7_must-gather-wd58h_df99b419-f085-4d93-9a9b-c68e56153aeb/copy/0.log" Oct 03 15:23:12 crc kubenswrapper[4636]: I1003 15:23:12.992009 4636 scope.go:117] "RemoveContainer" containerID="163e2f5889616e6ae9ca30297e0266fa0c20a71bdf7b7fec91cf28eac1f88990" Oct 03 15:23:12 crc kubenswrapper[4636]: I1003 15:23:12.992180 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r2hg7/must-gather-wd58h" Oct 03 15:23:13 crc kubenswrapper[4636]: I1003 15:23:13.016024 4636 scope.go:117] "RemoveContainer" containerID="5c1727e3cadf5d87e4029a8b2a00b0a4b180bed513c73a55840772a7137dfeb7" Oct 03 15:23:39 crc kubenswrapper[4636]: I1003 15:23:39.163453 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:23:39 crc kubenswrapper[4636]: I1003 15:23:39.164024 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:24:00 crc kubenswrapper[4636]: I1003 15:24:00.393523 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lcgn5/must-gather-77vjr"] Oct 03 15:24:00 crc kubenswrapper[4636]: E1003 15:24:00.394613 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba46466-b881-42f5-a24f-d4bdd1880169" containerName="extract-content" Oct 03 15:24:00 crc kubenswrapper[4636]: I1003 15:24:00.394630 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba46466-b881-42f5-a24f-d4bdd1880169" containerName="extract-content" Oct 03 15:24:00 crc kubenswrapper[4636]: E1003 15:24:00.394647 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df99b419-f085-4d93-9a9b-c68e56153aeb" containerName="copy" Oct 03 15:24:00 crc kubenswrapper[4636]: I1003 15:24:00.394654 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="df99b419-f085-4d93-9a9b-c68e56153aeb" containerName="copy" Oct 03 15:24:00 crc kubenswrapper[4636]: E1003 15:24:00.394666 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba46466-b881-42f5-a24f-d4bdd1880169" containerName="registry-server" Oct 03 15:24:00 crc kubenswrapper[4636]: I1003 15:24:00.394673 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba46466-b881-42f5-a24f-d4bdd1880169" containerName="registry-server" Oct 03 15:24:00 crc kubenswrapper[4636]: E1003 15:24:00.394705 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba46466-b881-42f5-a24f-d4bdd1880169" containerName="extract-utilities" Oct 03 15:24:00 crc kubenswrapper[4636]: I1003 15:24:00.394714 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba46466-b881-42f5-a24f-d4bdd1880169" containerName="extract-utilities" Oct 03 15:24:00 crc kubenswrapper[4636]: E1003 15:24:00.394728 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df99b419-f085-4d93-9a9b-c68e56153aeb" containerName="gather" Oct 03 15:24:00 crc kubenswrapper[4636]: I1003 15:24:00.394736 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="df99b419-f085-4d93-9a9b-c68e56153aeb" containerName="gather" Oct 03 15:24:00 crc kubenswrapper[4636]: I1003 15:24:00.394958 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="df99b419-f085-4d93-9a9b-c68e56153aeb" containerName="gather" Oct 03 15:24:00 crc kubenswrapper[4636]: I1003 15:24:00.394988 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="df99b419-f085-4d93-9a9b-c68e56153aeb" containerName="copy" Oct 03 15:24:00 crc kubenswrapper[4636]: I1003 
15:24:00.395008 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ba46466-b881-42f5-a24f-d4bdd1880169" containerName="registry-server" Oct 03 15:24:00 crc kubenswrapper[4636]: I1003 15:24:00.396451 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lcgn5/must-gather-77vjr" Oct 03 15:24:00 crc kubenswrapper[4636]: I1003 15:24:00.401181 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-lcgn5"/"default-dockercfg-v7n9r" Oct 03 15:24:00 crc kubenswrapper[4636]: I1003 15:24:00.403169 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lcgn5"/"kube-root-ca.crt" Oct 03 15:24:00 crc kubenswrapper[4636]: I1003 15:24:00.403739 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lcgn5"/"openshift-service-ca.crt" Oct 03 15:24:00 crc kubenswrapper[4636]: I1003 15:24:00.501987 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7xsb\" (UniqueName: \"kubernetes.io/projected/472a5287-7186-4eb6-8bd6-ce986291af01-kube-api-access-b7xsb\") pod \"must-gather-77vjr\" (UID: \"472a5287-7186-4eb6-8bd6-ce986291af01\") " pod="openshift-must-gather-lcgn5/must-gather-77vjr" Oct 03 15:24:00 crc kubenswrapper[4636]: I1003 15:24:00.502079 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/472a5287-7186-4eb6-8bd6-ce986291af01-must-gather-output\") pod \"must-gather-77vjr\" (UID: \"472a5287-7186-4eb6-8bd6-ce986291af01\") " pod="openshift-must-gather-lcgn5/must-gather-77vjr" Oct 03 15:24:00 crc kubenswrapper[4636]: I1003 15:24:00.603724 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/472a5287-7186-4eb6-8bd6-ce986291af01-must-gather-output\") pod \"must-gather-77vjr\" (UID: \"472a5287-7186-4eb6-8bd6-ce986291af01\") " pod="openshift-must-gather-lcgn5/must-gather-77vjr" Oct 03 15:24:00 crc kubenswrapper[4636]: I1003 15:24:00.603937 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7xsb\" (UniqueName: \"kubernetes.io/projected/472a5287-7186-4eb6-8bd6-ce986291af01-kube-api-access-b7xsb\") pod \"must-gather-77vjr\" (UID: \"472a5287-7186-4eb6-8bd6-ce986291af01\") " pod="openshift-must-gather-lcgn5/must-gather-77vjr" Oct 03 15:24:00 crc kubenswrapper[4636]: I1003 15:24:00.604719 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/472a5287-7186-4eb6-8bd6-ce986291af01-must-gather-output\") pod \"must-gather-77vjr\" (UID: \"472a5287-7186-4eb6-8bd6-ce986291af01\") " pod="openshift-must-gather-lcgn5/must-gather-77vjr" Oct 03 15:24:00 crc kubenswrapper[4636]: I1003 15:24:00.637370 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7xsb\" (UniqueName: \"kubernetes.io/projected/472a5287-7186-4eb6-8bd6-ce986291af01-kube-api-access-b7xsb\") pod \"must-gather-77vjr\" (UID: \"472a5287-7186-4eb6-8bd6-ce986291af01\") " pod="openshift-must-gather-lcgn5/must-gather-77vjr" Oct 03 15:24:00 crc kubenswrapper[4636]: I1003 15:24:00.690707 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lcgn5/must-gather-77vjr"] Oct 03 15:24:00 crc kubenswrapper[4636]: I1003 15:24:00.722318 4636 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lcgn5/must-gather-77vjr" Oct 03 15:24:02 crc kubenswrapper[4636]: I1003 15:24:02.029525 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lcgn5/must-gather-77vjr"] Oct 03 15:24:02 crc kubenswrapper[4636]: W1003 15:24:02.036649 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod472a5287_7186_4eb6_8bd6_ce986291af01.slice/crio-b2192c0cca646f958abb54180040a0930802307514cf5a4dce250dbf1bcf06ba WatchSource:0}: Error finding container b2192c0cca646f958abb54180040a0930802307514cf5a4dce250dbf1bcf06ba: Status 404 returned error can't find the container with id b2192c0cca646f958abb54180040a0930802307514cf5a4dce250dbf1bcf06ba Oct 03 15:24:02 crc kubenswrapper[4636]: I1003 15:24:02.483729 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lcgn5/must-gather-77vjr" event={"ID":"472a5287-7186-4eb6-8bd6-ce986291af01","Type":"ContainerStarted","Data":"b2192c0cca646f958abb54180040a0930802307514cf5a4dce250dbf1bcf06ba"} Oct 03 15:24:03 crc kubenswrapper[4636]: I1003 15:24:03.494680 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lcgn5/must-gather-77vjr" event={"ID":"472a5287-7186-4eb6-8bd6-ce986291af01","Type":"ContainerStarted","Data":"3a86c22f5b5b97e3656b9944cc65bba34ca8506cc3e6194c448d9dc6063cf2ae"} Oct 03 15:24:04 crc kubenswrapper[4636]: I1003 15:24:04.506858 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lcgn5/must-gather-77vjr" event={"ID":"472a5287-7186-4eb6-8bd6-ce986291af01","Type":"ContainerStarted","Data":"b480660671849de28de99e80e9a4f522a2c5418b45dfca5e70720db50b91af50"} Oct 03 15:24:04 crc kubenswrapper[4636]: I1003 15:24:04.530793 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lcgn5/must-gather-77vjr" podStartSLOduration=4.530773836 podStartE2EDuration="4.530773836s" podCreationTimestamp="2025-10-03 15:24:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:24:04.52864031 +0000 UTC m=+4994.387366557" watchObservedRunningTime="2025-10-03 15:24:04.530773836 +0000 UTC m=+4994.389500083" Oct 03 15:24:07 crc kubenswrapper[4636]: I1003 15:24:07.408695 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lcgn5/crc-debug-qd9d2"] Oct 03 15:24:07 crc kubenswrapper[4636]: I1003 15:24:07.411653 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lcgn5/crc-debug-qd9d2" Oct 03 15:24:07 crc kubenswrapper[4636]: I1003 15:24:07.551448 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1fca82f9-997b-4b27-a9bc-3b405a91f511-host\") pod \"crc-debug-qd9d2\" (UID: \"1fca82f9-997b-4b27-a9bc-3b405a91f511\") " pod="openshift-must-gather-lcgn5/crc-debug-qd9d2" Oct 03 15:24:07 crc kubenswrapper[4636]: I1003 15:24:07.551570 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s5xk\" (UniqueName: \"kubernetes.io/projected/1fca82f9-997b-4b27-a9bc-3b405a91f511-kube-api-access-8s5xk\") pod \"crc-debug-qd9d2\" (UID: \"1fca82f9-997b-4b27-a9bc-3b405a91f511\") " pod="openshift-must-gather-lcgn5/crc-debug-qd9d2" Oct 03 15:24:07 crc kubenswrapper[4636]: I1003 15:24:07.653025 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s5xk\" (UniqueName: \"kubernetes.io/projected/1fca82f9-997b-4b27-a9bc-3b405a91f511-kube-api-access-8s5xk\") pod \"crc-debug-qd9d2\" (UID: \"1fca82f9-997b-4b27-a9bc-3b405a91f511\") " pod="openshift-must-gather-lcgn5/crc-debug-qd9d2" Oct 03 15:24:07 crc kubenswrapper[4636]: I1003 15:24:07.653209 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1fca82f9-997b-4b27-a9bc-3b405a91f511-host\") pod \"crc-debug-qd9d2\" (UID: \"1fca82f9-997b-4b27-a9bc-3b405a91f511\") " pod="openshift-must-gather-lcgn5/crc-debug-qd9d2" Oct 03 15:24:07 crc kubenswrapper[4636]: I1003 15:24:07.653352 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1fca82f9-997b-4b27-a9bc-3b405a91f511-host\") pod \"crc-debug-qd9d2\" (UID: \"1fca82f9-997b-4b27-a9bc-3b405a91f511\") " pod="openshift-must-gather-lcgn5/crc-debug-qd9d2" Oct 03 15:24:07 crc kubenswrapper[4636]: I1003 15:24:07.674291 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s5xk\" (UniqueName: \"kubernetes.io/projected/1fca82f9-997b-4b27-a9bc-3b405a91f511-kube-api-access-8s5xk\") pod \"crc-debug-qd9d2\" (UID: \"1fca82f9-997b-4b27-a9bc-3b405a91f511\") " pod="openshift-must-gather-lcgn5/crc-debug-qd9d2" Oct 03 15:24:07 crc kubenswrapper[4636]: I1003 15:24:07.735587 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lcgn5/crc-debug-qd9d2" Oct 03 15:24:07 crc kubenswrapper[4636]: W1003 15:24:07.764299 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fca82f9_997b_4b27_a9bc_3b405a91f511.slice/crio-aae5bab69602a4f8abf76bbff84fe4940a4c5f301c2e304abbd24b06a78b8c78 WatchSource:0}: Error finding container aae5bab69602a4f8abf76bbff84fe4940a4c5f301c2e304abbd24b06a78b8c78: Status 404 returned error can't find the container with id aae5bab69602a4f8abf76bbff84fe4940a4c5f301c2e304abbd24b06a78b8c78 Oct 03 15:24:08 crc kubenswrapper[4636]: I1003 15:24:08.572228 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lcgn5/crc-debug-qd9d2" event={"ID":"1fca82f9-997b-4b27-a9bc-3b405a91f511","Type":"ContainerStarted","Data":"e290993850c0c5ca851f80f706db67ad321a11fe4949e7ec53deb57461433959"} Oct 03 15:24:08 crc kubenswrapper[4636]: I1003 15:24:08.572638 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lcgn5/crc-debug-qd9d2" event={"ID":"1fca82f9-997b-4b27-a9bc-3b405a91f511","Type":"ContainerStarted","Data":"aae5bab69602a4f8abf76bbff84fe4940a4c5f301c2e304abbd24b06a78b8c78"} Oct 03 15:24:08 crc kubenswrapper[4636]: I1003 15:24:08.595608 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lcgn5/crc-debug-qd9d2" podStartSLOduration=1.5955878060000002 podStartE2EDuration="1.595587806s" podCreationTimestamp="2025-10-03 15:24:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:24:08.586336243 +0000 UTC m=+4998.445062490" watchObservedRunningTime="2025-10-03 15:24:08.595587806 +0000 UTC m=+4998.454314053" Oct 03 15:24:09 crc kubenswrapper[4636]: I1003 15:24:09.163245 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:24:09 crc kubenswrapper[4636]: I1003 15:24:09.163628 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:24:09 crc kubenswrapper[4636]: I1003 15:24:09.163680 4636 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 15:24:09 crc kubenswrapper[4636]: I1003 15:24:09.164484 4636 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2"} pod="openshift-machine-config-operator/machine-config-daemon-ngmch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 15:24:09 crc kubenswrapper[4636]: I1003 15:24:09.164555 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" 
containerID="cri-o://5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" gracePeriod=600 Oct 03 15:24:09 crc kubenswrapper[4636]: E1003 15:24:09.302031 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:24:09 crc kubenswrapper[4636]: I1003 15:24:09.582873 4636 generic.go:334] "Generic (PLEG): container finished" podID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" exitCode=0 Oct 03 15:24:09 crc kubenswrapper[4636]: I1003 15:24:09.582921 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerDied","Data":"5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2"} Oct 03 15:24:09 crc kubenswrapper[4636]: I1003 15:24:09.582960 4636 scope.go:117] "RemoveContainer" containerID="913c176d23c8c8c6a552be6bf2fd2627170ae4a3ef1ef4abe575bb231ee8bb69" Oct 03 15:24:09 crc kubenswrapper[4636]: I1003 15:24:09.583645 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:24:09 crc kubenswrapper[4636]: E1003 15:24:09.583951 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:24:21 crc kubenswrapper[4636]: I1003 15:24:21.794217 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:24:21 crc kubenswrapper[4636]: E1003 15:24:21.795065 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:24:35 crc kubenswrapper[4636]: I1003 15:24:35.793896 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:24:35 crc kubenswrapper[4636]: E1003 15:24:35.794611 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:24:49 crc kubenswrapper[4636]: I1003 15:24:49.793867 4636 scope.go:117] "RemoveContainer" 
containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:24:49 crc kubenswrapper[4636]: E1003 15:24:49.794720 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:25:03 crc kubenswrapper[4636]: I1003 15:25:03.794733 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:25:03 crc kubenswrapper[4636]: E1003 15:25:03.795484 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:25:11 crc kubenswrapper[4636]: I1003 15:25:11.314204 4636 scope.go:117] "RemoveContainer" containerID="377ad8605e112984615ac4c1c7e93bd3cd68dcdb20bf7cc5f4982dea7eb3ec29" Oct 03 15:25:18 crc kubenswrapper[4636]: I1003 15:25:18.794657 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:25:18 crc kubenswrapper[4636]: E1003 15:25:18.795545 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:25:33 crc kubenswrapper[4636]: I1003 15:25:33.794084 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:25:33 crc kubenswrapper[4636]: E1003 15:25:33.794773 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:25:35 crc kubenswrapper[4636]: I1003 15:25:35.831983 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-f548d674d-2q8gg_ee5faec7-3829-49b6-aca7-452f5eae6a67/barbican-api/0.log" Oct 03 15:25:35 crc kubenswrapper[4636]: I1003 15:25:35.894422 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-f548d674d-2q8gg_ee5faec7-3829-49b6-aca7-452f5eae6a67/barbican-api-log/0.log" Oct 03 15:25:36 crc kubenswrapper[4636]: I1003 15:25:36.062389 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f6878bdf6-2vf98_9cf408a6-c7e6-4bf3-80c6-47cc10bec465/barbican-keystone-listener/0.log" Oct 03 15:25:36 crc kubenswrapper[4636]: I1003 15:25:36.151307 4636 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f6878bdf6-2vf98_9cf408a6-c7e6-4bf3-80c6-47cc10bec465/barbican-keystone-listener-log/0.log" Oct 03 15:25:36 crc kubenswrapper[4636]: I1003 15:25:36.253282 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6f76699687-g9k2g_596e3078-e359-4e8d-a7c0-74c710f2c2f9/barbican-worker/0.log" Oct 03 15:25:36 crc kubenswrapper[4636]: I1003 15:25:36.400781 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6f76699687-g9k2g_596e3078-e359-4e8d-a7c0-74c710f2c2f9/barbican-worker-log/0.log" Oct 03 15:25:36 crc kubenswrapper[4636]: I1003 15:25:36.610371 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-4srhr_57d50548-733b-4696-9e0f-fc749406a055/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:25:36 crc kubenswrapper[4636]: I1003 15:25:36.836655 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e/ceilometer-central-agent/0.log" Oct 03 15:25:36 crc kubenswrapper[4636]: I1003 15:25:36.899468 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e/ceilometer-notification-agent/0.log" Oct 03 15:25:36 crc kubenswrapper[4636]: I1003 15:25:36.965193 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e/proxy-httpd/0.log" Oct 03 15:25:37 crc kubenswrapper[4636]: I1003 15:25:37.101070 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ffb7772f-60ac-4eeb-ac8d-09965dcb4b3e/sg-core/0.log" Oct 03 15:25:37 crc kubenswrapper[4636]: I1003 15:25:37.247935 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_7a3dbfb9-f2b2-4725-9960-07d3fb89125e/cinder-api/0.log" Oct 03 15:25:37 crc kubenswrapper[4636]: I1003 15:25:37.374268 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_7a3dbfb9-f2b2-4725-9960-07d3fb89125e/cinder-api-log/0.log" Oct 03 15:25:37 crc kubenswrapper[4636]: I1003 15:25:37.498259 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_94180bad-9d72-4d67-aefa-1fd7a9d886ac/cinder-scheduler/0.log" Oct 03 15:25:37 crc kubenswrapper[4636]: I1003 15:25:37.668376 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_94180bad-9d72-4d67-aefa-1fd7a9d886ac/probe/0.log" Oct 03 15:25:37 crc kubenswrapper[4636]: I1003 15:25:37.793596 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-qplrd_9781ac24-d39e-4e00-b2e8-3eac5f120090/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:25:38 crc kubenswrapper[4636]: I1003 15:25:38.076183 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-kn84r_ca12d2cd-3187-4910-9e28-2f977be4bcf8/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:25:38 crc kubenswrapper[4636]: I1003 15:25:38.192690 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-wdcqk_e872c241-3445-4382-a7f0-1a15d6a223c2/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:25:38 crc kubenswrapper[4636]: I1003 15:25:38.387992 4636 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-d7677974f-dtkft_317017e9-687f-4a84-b896-fab84c269e2b/init/0.log" Oct 03 15:25:38 crc kubenswrapper[4636]: I1003 15:25:38.605587 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-d7677974f-dtkft_317017e9-687f-4a84-b896-fab84c269e2b/init/0.log" Oct 03 15:25:38 crc kubenswrapper[4636]: I1003 15:25:38.781251 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-d7677974f-dtkft_317017e9-687f-4a84-b896-fab84c269e2b/dnsmasq-dns/0.log" Oct 03 15:25:38 crc kubenswrapper[4636]: I1003 15:25:38.903094 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-f99bb_a1c24630-7d57-45b9-8bdd-fb45d6a74c61/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:25:39 crc kubenswrapper[4636]: I1003 15:25:39.134409 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_48268aa0-45d6-42d4-a902-6f9221eae8d7/glance-httpd/0.log" Oct 03 15:25:39 crc kubenswrapper[4636]: I1003 15:25:39.213020 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_48268aa0-45d6-42d4-a902-6f9221eae8d7/glance-log/0.log" Oct 03 15:25:39 crc kubenswrapper[4636]: I1003 15:25:39.363762 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4/glance-log/0.log" Oct 03 15:25:39 crc kubenswrapper[4636]: I1003 15:25:39.423986 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8f9d57db-69a6-4123-bcd5-a1b83b1c9cc4/glance-httpd/0.log" Oct 03 15:25:39 crc kubenswrapper[4636]: I1003 15:25:39.676376 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-8c5bc9456-rfvns_0025da7c-17f3-4036-a9fc-3330508c11cd/horizon/1.log" Oct 03 15:25:39 crc kubenswrapper[4636]: I1003 15:25:39.730669 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-8c5bc9456-rfvns_0025da7c-17f3-4036-a9fc-3330508c11cd/horizon/0.log" Oct 03 15:25:40 crc kubenswrapper[4636]: I1003 15:25:40.023993 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-rldfm_9634671b-cf60-4cdf-9558-417432ff5401/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:25:40 crc kubenswrapper[4636]: I1003 15:25:40.181980 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-r9sdw_52b89ba7-3476-42ae-aa47-fb7a38732669/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:25:40 crc kubenswrapper[4636]: I1003 15:25:40.317034 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-8c5bc9456-rfvns_0025da7c-17f3-4036-a9fc-3330508c11cd/horizon-log/0.log" Oct 03 15:25:40 crc kubenswrapper[4636]: I1003 15:25:40.530138 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29325061-4mrxv_e2f61a03-c4e7-414d-b6f9-b1f920d35757/keystone-cron/0.log" Oct 03 15:25:40 crc kubenswrapper[4636]: I1003 15:25:40.748870 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_f13ed4ac-9670-4c2d-9dd7-3b49c53b14b0/kube-state-metrics/0.log" Oct 03 15:25:40 crc kubenswrapper[4636]: I1003 15:25:40.878139 4636 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-7b4f64b6bf-z54p6_5ddc1097-69d8-4db3-93f1-a43038191aae/keystone-api/0.log" Oct 03 15:25:40 crc kubenswrapper[4636]: I1003 15:25:40.961460 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-lpqnk_917285a7-3281-4326-8837-f1db2fe9a711/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:25:41 crc kubenswrapper[4636]: I1003 15:25:41.815751 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-d7d56d58f-cswwm_58fac2cb-4974-4241-8a11-77ad13d22306/neutron-httpd/0.log" Oct 03 15:25:41 crc kubenswrapper[4636]: I1003 15:25:41.826616 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-cvzf2_4932588e-72ae-44a2-bc95-08cd792a140f/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:25:41 crc kubenswrapper[4636]: I1003 15:25:41.980748 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-d7d56d58f-cswwm_58fac2cb-4974-4241-8a11-77ad13d22306/neutron-api/0.log" Oct 03 15:25:43 crc kubenswrapper[4636]: I1003 15:25:43.072159 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_efbece4d-3b40-41b8-819a-9dac3cf42b21/nova-cell0-conductor-conductor/0.log" Oct 03 15:25:43 crc kubenswrapper[4636]: I1003 15:25:43.646429 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8c04651e-c4ab-4322-ae46-6ee8a115ed64/nova-api-log/0.log" Oct 03 15:25:43 crc kubenswrapper[4636]: I1003 15:25:43.953465 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8c04651e-c4ab-4322-ae46-6ee8a115ed64/nova-api-api/0.log" Oct 03 15:25:44 crc kubenswrapper[4636]: I1003 15:25:44.013905 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_d3c3ee37-3ab9-49b5-b1e0-763a5155bd8d/nova-cell1-conductor-conductor/0.log" Oct 03 15:25:44 crc kubenswrapper[4636]: I1003 15:25:44.435822 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-ggwhp_ee4e092c-de87-4547-a39a-1a451ef9dc64/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:25:44 crc kubenswrapper[4636]: I1003 15:25:44.442277 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_68498e63-11ab-4746-ae7f-01662c1e136f/nova-cell1-novncproxy-novncproxy/0.log" Oct 03 15:25:44 crc kubenswrapper[4636]: I1003 15:25:44.828518 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_4c9bc86e-3770-40e9-bf37-80627278032b/nova-metadata-log/0.log" Oct 03 15:25:45 crc kubenswrapper[4636]: I1003 15:25:45.593631 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b3439f9c-0086-413d-a84f-79e7da2ffcbd/mysql-bootstrap/0.log" Oct 03 15:25:45 crc kubenswrapper[4636]: I1003 15:25:45.772331 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_493bf5be-a62b-4d5e-8de8-082ab7d23842/nova-scheduler-scheduler/0.log" Oct 03 15:25:45 crc kubenswrapper[4636]: I1003 15:25:45.918694 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b3439f9c-0086-413d-a84f-79e7da2ffcbd/mysql-bootstrap/0.log" Oct 03 15:25:46 crc kubenswrapper[4636]: I1003 15:25:46.037358 4636 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_b3439f9c-0086-413d-a84f-79e7da2ffcbd/galera/0.log" Oct 03 15:25:46 crc kubenswrapper[4636]: I1003 15:25:46.398959 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_781432ad-b393-4271-8a8a-39254e422cd4/mysql-bootstrap/0.log" Oct 03 15:25:46 crc kubenswrapper[4636]: I1003 15:25:46.626375 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_781432ad-b393-4271-8a8a-39254e422cd4/mysql-bootstrap/0.log" Oct 03 15:25:46 crc kubenswrapper[4636]: I1003 15:25:46.702560 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_781432ad-b393-4271-8a8a-39254e422cd4/galera/0.log" Oct 03 15:25:46 crc kubenswrapper[4636]: I1003 15:25:46.795488 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:25:46 crc kubenswrapper[4636]: E1003 15:25:46.795729 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:25:46 crc kubenswrapper[4636]: I1003 15:25:46.831073 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_4c9bc86e-3770-40e9-bf37-80627278032b/nova-metadata-metadata/0.log" Oct 03 15:25:46 crc kubenswrapper[4636]: I1003 15:25:46.953976 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_0a7aa438-f4f0-4975-a0e8-1005b56f8957/openstackclient/0.log" Oct 03 15:25:47 crc kubenswrapper[4636]: I1003 15:25:47.261748 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-2mfj2_62646db9-d39c-4cb1-b308-22dff51e4bcf/ovn-controller/0.log" Oct 03 15:25:47 crc kubenswrapper[4636]: I1003 15:25:47.841287 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-4f5ff_291d0189-08a0-4b8b-8406-8601de0e3708/openstack-network-exporter/0.log" Oct 03 15:25:48 crc kubenswrapper[4636]: I1003 15:25:48.014855 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2pfz4_fc054158-e506-4945-b3da-50265dc1b1aa/ovsdb-server-init/0.log" Oct 03 15:25:48 crc kubenswrapper[4636]: I1003 15:25:48.335661 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2pfz4_fc054158-e506-4945-b3da-50265dc1b1aa/ovs-vswitchd/0.log" Oct 03 15:25:48 crc kubenswrapper[4636]: I1003 15:25:48.338757 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2pfz4_fc054158-e506-4945-b3da-50265dc1b1aa/ovsdb-server-init/0.log" Oct 03 15:25:48 crc kubenswrapper[4636]: I1003 15:25:48.344772 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2pfz4_fc054158-e506-4945-b3da-50265dc1b1aa/ovsdb-server/0.log" Oct 03 15:25:48 crc kubenswrapper[4636]: I1003 15:25:48.757336 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-frzc6_d1e8fa7f-c140-4196-8967-ca303b35e8c5/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:25:48 crc kubenswrapper[4636]: I1003 15:25:48.888108 4636 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ovn-northd-0_9017beb0-a89a-4efa-b304-ee0ab7a8ce54/openstack-network-exporter/0.log" Oct 03 15:25:48 crc kubenswrapper[4636]: I1003 15:25:48.967429 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9017beb0-a89a-4efa-b304-ee0ab7a8ce54/ovn-northd/0.log" Oct 03 15:25:49 crc kubenswrapper[4636]: I1003 15:25:49.211610 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_af99ddda-1ae6-4b70-9422-06c99e8664e5/openstack-network-exporter/0.log" Oct 03 15:25:49 crc kubenswrapper[4636]: I1003 15:25:49.262010 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_af99ddda-1ae6-4b70-9422-06c99e8664e5/ovsdbserver-nb/0.log" Oct 03 15:25:49 crc kubenswrapper[4636]: I1003 15:25:49.522291 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2a4510e7-aa39-4e1f-80bb-196127d2643c/openstack-network-exporter/0.log" Oct 03 15:25:49 crc kubenswrapper[4636]: I1003 15:25:49.545517 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2a4510e7-aa39-4e1f-80bb-196127d2643c/ovsdbserver-sb/0.log" Oct 03 15:25:50 crc kubenswrapper[4636]: I1003 15:25:50.148177 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6796cf444-9xs6c_7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2/placement-api/0.log" Oct 03 15:25:50 crc kubenswrapper[4636]: I1003 15:25:50.204971 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6796cf444-9xs6c_7ab1d10b-e0d0-426a-a90f-6f8969e3c8b2/placement-log/0.log" Oct 03 15:25:50 crc kubenswrapper[4636]: I1003 15:25:50.416516 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e97eeb5a-f169-4c58-bda2-c727ca1f5126/setup-container/0.log" Oct 03 15:25:50 crc kubenswrapper[4636]: I1003 15:25:50.646826 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e97eeb5a-f169-4c58-bda2-c727ca1f5126/setup-container/0.log" Oct 03 15:25:50 crc kubenswrapper[4636]: I1003 15:25:50.852857 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e97eeb5a-f169-4c58-bda2-c727ca1f5126/rabbitmq/0.log" Oct 03 15:25:50 crc kubenswrapper[4636]: I1003 15:25:50.998128 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f7c3cb64-6553-4d95-8ccc-25f758b3cc97/setup-container/0.log" Oct 03 15:25:51 crc kubenswrapper[4636]: I1003 15:25:51.222951 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f7c3cb64-6553-4d95-8ccc-25f758b3cc97/setup-container/0.log" Oct 03 15:25:51 crc kubenswrapper[4636]: I1003 15:25:51.618312 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f7c3cb64-6553-4d95-8ccc-25f758b3cc97/rabbitmq/0.log" Oct 03 15:25:51 crc kubenswrapper[4636]: I1003 15:25:51.828716 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-82kv9_0057d92e-1564-4b8e-93e9-aee9f862501e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:25:51 crc kubenswrapper[4636]: I1003 15:25:51.872006 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-4vk2h_e7ae7cb3-1588-4c70-92e2-942cef9d9b0a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:25:52 crc kubenswrapper[4636]: I1003 
15:25:52.232944 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-z2rxw_9eb85b02-3bf8-4fe8-a060-c3593e995499/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:25:52 crc kubenswrapper[4636]: I1003 15:25:52.552409 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-tlxw2_1af273b7-459c-4175-9085-28fa11fb76ee/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:25:52 crc kubenswrapper[4636]: I1003 15:25:52.652050 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-f24zs_4fa2c95f-4798-46d0-8e21-31334d585714/ssh-known-hosts-edpm-deployment/0.log" Oct 03 15:25:53 crc kubenswrapper[4636]: I1003 15:25:53.055429 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6766dbb747-7j5j7_e0a3acac-6d5f-49d7-9b2e-52bd155fb674/proxy-server/0.log" Oct 03 15:25:53 crc kubenswrapper[4636]: I1003 15:25:53.137067 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6766dbb747-7j5j7_e0a3acac-6d5f-49d7-9b2e-52bd155fb674/proxy-httpd/0.log" Oct 03 15:25:53 crc kubenswrapper[4636]: I1003 15:25:53.340247 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-qz8ds_00eeeec0-4e4a-4e2c-aaa6-07a793372fd7/swift-ring-rebalance/0.log" Oct 03 15:25:53 crc kubenswrapper[4636]: I1003 15:25:53.530519 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/account-auditor/0.log" Oct 03 15:25:53 crc kubenswrapper[4636]: I1003 15:25:53.651770 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/account-reaper/0.log" Oct 03 15:25:53 crc kubenswrapper[4636]: I1003 15:25:53.746113 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/account-replicator/0.log" Oct 03 15:25:53 crc kubenswrapper[4636]: I1003 15:25:53.839928 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/account-server/0.log" Oct 03 15:25:53 crc kubenswrapper[4636]: I1003 15:25:53.944198 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/container-auditor/0.log" Oct 03 15:25:54 crc kubenswrapper[4636]: I1003 15:25:54.089329 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/container-server/0.log" Oct 03 15:25:54 crc kubenswrapper[4636]: I1003 15:25:54.168600 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/container-replicator/0.log" Oct 03 15:25:54 crc kubenswrapper[4636]: I1003 15:25:54.194214 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/container-updater/0.log" Oct 03 15:25:54 crc kubenswrapper[4636]: I1003 15:25:54.549185 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/object-expirer/0.log" Oct 03 15:25:54 crc kubenswrapper[4636]: I1003 15:25:54.553917 4636 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/object-auditor/0.log" Oct 03 15:25:54 crc kubenswrapper[4636]: I1003 15:25:54.687801 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/object-replicator/0.log" Oct 03 15:25:54 crc kubenswrapper[4636]: I1003 15:25:54.865960 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/object-updater/0.log" Oct 03 15:25:54 crc kubenswrapper[4636]: I1003 15:25:54.949818 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/object-server/0.log" Oct 03 15:25:55 crc kubenswrapper[4636]: I1003 15:25:55.245676 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/rsync/0.log" Oct 03 15:25:55 crc kubenswrapper[4636]: I1003 15:25:55.487294 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_201b506e-9cc5-4ab0-9af4-96a357d19f6e/swift-recon-cron/0.log" Oct 03 15:25:55 crc kubenswrapper[4636]: I1003 15:25:55.673283 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4njmj_88e3290e-0c0d-4304-bcd2-b500068dc443/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:25:55 crc kubenswrapper[4636]: I1003 15:25:55.826434 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_76d391b3-cee3-4591-814b-a1b99bed1872/tempest-tests-tempest-tests-runner/0.log" Oct 03 15:25:56 crc kubenswrapper[4636]: I1003 15:25:56.014538 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_d141c423-495c-4fa0-af39-06bd5c484253/test-operator-logs-container/0.log" Oct 03 15:25:56 crc kubenswrapper[4636]: I1003 15:25:56.337741 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-2jt64_baf6dabc-cac4-4e7c-9101-dcd5cfe39647/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 15:25:59 crc kubenswrapper[4636]: I1003 15:25:59.794241 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:25:59 crc kubenswrapper[4636]: E1003 15:25:59.794897 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:26:03 crc kubenswrapper[4636]: I1003 15:26:03.360774 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_17e09844-cd33-42a1-a0dc-e1995b872663/memcached/0.log" Oct 03 15:26:14 crc kubenswrapper[4636]: I1003 15:26:14.797830 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:26:14 crc kubenswrapper[4636]: E1003 15:26:14.799141 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:26:27 crc kubenswrapper[4636]: I1003 15:26:27.794552 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:26:27 crc kubenswrapper[4636]: E1003 15:26:27.795358 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:26:34 crc kubenswrapper[4636]: I1003 15:26:34.008412 4636 generic.go:334] "Generic (PLEG): container finished" podID="1fca82f9-997b-4b27-a9bc-3b405a91f511" containerID="e290993850c0c5ca851f80f706db67ad321a11fe4949e7ec53deb57461433959" exitCode=0 Oct 03 15:26:34 crc kubenswrapper[4636]: I1003 15:26:34.008518 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lcgn5/crc-debug-qd9d2" event={"ID":"1fca82f9-997b-4b27-a9bc-3b405a91f511","Type":"ContainerDied","Data":"e290993850c0c5ca851f80f706db67ad321a11fe4949e7ec53deb57461433959"} Oct 03 15:26:35 crc kubenswrapper[4636]: I1003 15:26:35.128003 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lcgn5/crc-debug-qd9d2" Oct 03 15:26:35 crc kubenswrapper[4636]: I1003 15:26:35.161667 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lcgn5/crc-debug-qd9d2"] Oct 03 15:26:35 crc kubenswrapper[4636]: I1003 15:26:35.163704 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s5xk\" (UniqueName: \"kubernetes.io/projected/1fca82f9-997b-4b27-a9bc-3b405a91f511-kube-api-access-8s5xk\") pod \"1fca82f9-997b-4b27-a9bc-3b405a91f511\" (UID: \"1fca82f9-997b-4b27-a9bc-3b405a91f511\") " Oct 03 15:26:35 crc kubenswrapper[4636]: I1003 15:26:35.164093 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1fca82f9-997b-4b27-a9bc-3b405a91f511-host\") pod \"1fca82f9-997b-4b27-a9bc-3b405a91f511\" (UID: \"1fca82f9-997b-4b27-a9bc-3b405a91f511\") " Oct 03 15:26:35 crc kubenswrapper[4636]: I1003 15:26:35.164268 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1fca82f9-997b-4b27-a9bc-3b405a91f511-host" (OuterVolumeSpecName: "host") pod "1fca82f9-997b-4b27-a9bc-3b405a91f511" (UID: "1fca82f9-997b-4b27-a9bc-3b405a91f511"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 15:26:35 crc kubenswrapper[4636]: I1003 15:26:35.164686 4636 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1fca82f9-997b-4b27-a9bc-3b405a91f511-host\") on node \"crc\" DevicePath \"\"" Oct 03 15:26:35 crc kubenswrapper[4636]: I1003 15:26:35.178847 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fca82f9-997b-4b27-a9bc-3b405a91f511-kube-api-access-8s5xk" (OuterVolumeSpecName: "kube-api-access-8s5xk") pod "1fca82f9-997b-4b27-a9bc-3b405a91f511" (UID: "1fca82f9-997b-4b27-a9bc-3b405a91f511"). InnerVolumeSpecName "kube-api-access-8s5xk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:26:35 crc kubenswrapper[4636]: I1003 15:26:35.182956 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lcgn5/crc-debug-qd9d2"] Oct 03 15:26:35 crc kubenswrapper[4636]: I1003 15:26:35.266822 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s5xk\" (UniqueName: \"kubernetes.io/projected/1fca82f9-997b-4b27-a9bc-3b405a91f511-kube-api-access-8s5xk\") on node \"crc\" DevicePath \"\"" Oct 03 15:26:36 crc kubenswrapper[4636]: I1003 15:26:36.033487 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aae5bab69602a4f8abf76bbff84fe4940a4c5f301c2e304abbd24b06a78b8c78" Oct 03 15:26:36 crc kubenswrapper[4636]: I1003 15:26:36.033539 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lcgn5/crc-debug-qd9d2" Oct 03 15:26:36 crc kubenswrapper[4636]: I1003 15:26:36.314267 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lcgn5/crc-debug-2skz4"] Oct 03 15:26:36 crc kubenswrapper[4636]: E1003 15:26:36.314642 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fca82f9-997b-4b27-a9bc-3b405a91f511" containerName="container-00" Oct 03 15:26:36 crc kubenswrapper[4636]: I1003 15:26:36.314653 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fca82f9-997b-4b27-a9bc-3b405a91f511" containerName="container-00" Oct 03 15:26:36 crc kubenswrapper[4636]: I1003 15:26:36.314841 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fca82f9-997b-4b27-a9bc-3b405a91f511" containerName="container-00" Oct 03 15:26:36 crc kubenswrapper[4636]: I1003 15:26:36.315655 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lcgn5/crc-debug-2skz4" Oct 03 15:26:36 crc kubenswrapper[4636]: I1003 15:26:36.389403 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj8wz\" (UniqueName: \"kubernetes.io/projected/b3a48cf5-20b3-4558-b970-60a0ef660d8f-kube-api-access-tj8wz\") pod \"crc-debug-2skz4\" (UID: \"b3a48cf5-20b3-4558-b970-60a0ef660d8f\") " pod="openshift-must-gather-lcgn5/crc-debug-2skz4" Oct 03 15:26:36 crc kubenswrapper[4636]: I1003 15:26:36.389732 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3a48cf5-20b3-4558-b970-60a0ef660d8f-host\") pod \"crc-debug-2skz4\" (UID: \"b3a48cf5-20b3-4558-b970-60a0ef660d8f\") " pod="openshift-must-gather-lcgn5/crc-debug-2skz4" Oct 03 15:26:36 crc kubenswrapper[4636]: I1003 15:26:36.491899 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj8wz\" (UniqueName: \"kubernetes.io/projected/b3a48cf5-20b3-4558-b970-60a0ef660d8f-kube-api-access-tj8wz\") pod \"crc-debug-2skz4\" (UID: \"b3a48cf5-20b3-4558-b970-60a0ef660d8f\") " pod="openshift-must-gather-lcgn5/crc-debug-2skz4" Oct 03 15:26:36 crc kubenswrapper[4636]: I1003 15:26:36.491986 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3a48cf5-20b3-4558-b970-60a0ef660d8f-host\") pod \"crc-debug-2skz4\" (UID: \"b3a48cf5-20b3-4558-b970-60a0ef660d8f\") " pod="openshift-must-gather-lcgn5/crc-debug-2skz4" Oct 03 15:26:36 crc kubenswrapper[4636]: I1003 15:26:36.492227 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3a48cf5-20b3-4558-b970-60a0ef660d8f-host\") pod \"crc-debug-2skz4\" (UID: \"b3a48cf5-20b3-4558-b970-60a0ef660d8f\") " pod="openshift-must-gather-lcgn5/crc-debug-2skz4" Oct 03 15:26:36 crc kubenswrapper[4636]: I1003 15:26:36.511862 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj8wz\" (UniqueName: \"kubernetes.io/projected/b3a48cf5-20b3-4558-b970-60a0ef660d8f-kube-api-access-tj8wz\") pod \"crc-debug-2skz4\" (UID: \"b3a48cf5-20b3-4558-b970-60a0ef660d8f\") " pod="openshift-must-gather-lcgn5/crc-debug-2skz4" Oct 03 15:26:36 crc kubenswrapper[4636]: I1003 15:26:36.634687 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lcgn5/crc-debug-2skz4" Oct 03 15:26:36 crc kubenswrapper[4636]: I1003 15:26:36.803859 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fca82f9-997b-4b27-a9bc-3b405a91f511" path="/var/lib/kubelet/pods/1fca82f9-997b-4b27-a9bc-3b405a91f511/volumes" Oct 03 15:26:37 crc kubenswrapper[4636]: I1003 15:26:37.042642 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lcgn5/crc-debug-2skz4" event={"ID":"b3a48cf5-20b3-4558-b970-60a0ef660d8f","Type":"ContainerStarted","Data":"42979e07d817b92eebb7cd74d1e37a497149b78ff91d6d1d1864b86ea2b731aa"} Oct 03 15:26:37 crc kubenswrapper[4636]: I1003 15:26:37.042695 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lcgn5/crc-debug-2skz4" event={"ID":"b3a48cf5-20b3-4558-b970-60a0ef660d8f","Type":"ContainerStarted","Data":"c10c43401a1b62fc35223769e799cbebe1709932f8c91f0cd4d90cd011164e16"} Oct 03 15:26:37 crc kubenswrapper[4636]: I1003 15:26:37.058817 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lcgn5/crc-debug-2skz4" podStartSLOduration=1.058798698 podStartE2EDuration="1.058798698s" podCreationTimestamp="2025-10-03 15:26:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 15:26:37.057884924 +0000 UTC m=+5146.916611181" watchObservedRunningTime="2025-10-03 15:26:37.058798698 +0000 UTC m=+5146.917524945" Oct 03 15:26:37 crc kubenswrapper[4636]: I1003 15:26:37.832570 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fps9r"] Oct 03 15:26:37 crc kubenswrapper[4636]: I1003 15:26:37.835481 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fps9r" Oct 03 15:26:37 crc kubenswrapper[4636]: I1003 15:26:37.845117 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fps9r"] Oct 03 15:26:37 crc kubenswrapper[4636]: I1003 15:26:37.943856 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcccafa-73d9-4371-8135-bfa07be19ecf-catalog-content\") pod \"redhat-marketplace-fps9r\" (UID: \"1dcccafa-73d9-4371-8135-bfa07be19ecf\") " pod="openshift-marketplace/redhat-marketplace-fps9r" Oct 03 15:26:37 crc kubenswrapper[4636]: I1003 15:26:37.944054 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcccafa-73d9-4371-8135-bfa07be19ecf-utilities\") pod \"redhat-marketplace-fps9r\" (UID: \"1dcccafa-73d9-4371-8135-bfa07be19ecf\") " pod="openshift-marketplace/redhat-marketplace-fps9r" Oct 03 15:26:37 crc kubenswrapper[4636]: I1003 15:26:37.944126 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ppj2\" (UniqueName: \"kubernetes.io/projected/1dcccafa-73d9-4371-8135-bfa07be19ecf-kube-api-access-2ppj2\") pod \"redhat-marketplace-fps9r\" (UID: \"1dcccafa-73d9-4371-8135-bfa07be19ecf\") " pod="openshift-marketplace/redhat-marketplace-fps9r" Oct 03 15:26:38 crc kubenswrapper[4636]: I1003 15:26:38.046049 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcccafa-73d9-4371-8135-bfa07be19ecf-catalog-content\") pod \"redhat-marketplace-fps9r\" (UID: \"1dcccafa-73d9-4371-8135-bfa07be19ecf\") " pod="openshift-marketplace/redhat-marketplace-fps9r" Oct 03 15:26:38 crc kubenswrapper[4636]: I1003 15:26:38.046178 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcccafa-73d9-4371-8135-bfa07be19ecf-utilities\") pod \"redhat-marketplace-fps9r\" (UID: \"1dcccafa-73d9-4371-8135-bfa07be19ecf\") " pod="openshift-marketplace/redhat-marketplace-fps9r" Oct 03 15:26:38 crc kubenswrapper[4636]: I1003 15:26:38.046228 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ppj2\" (UniqueName: \"kubernetes.io/projected/1dcccafa-73d9-4371-8135-bfa07be19ecf-kube-api-access-2ppj2\") pod \"redhat-marketplace-fps9r\" (UID: \"1dcccafa-73d9-4371-8135-bfa07be19ecf\") " pod="openshift-marketplace/redhat-marketplace-fps9r" Oct 03 15:26:38 crc kubenswrapper[4636]: I1003 15:26:38.047001 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcccafa-73d9-4371-8135-bfa07be19ecf-utilities\") pod \"redhat-marketplace-fps9r\" (UID: \"1dcccafa-73d9-4371-8135-bfa07be19ecf\") " pod="openshift-marketplace/redhat-marketplace-fps9r" Oct 03 15:26:38 crc kubenswrapper[4636]: I1003 15:26:38.047020 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcccafa-73d9-4371-8135-bfa07be19ecf-catalog-content\") pod \"redhat-marketplace-fps9r\" (UID: \"1dcccafa-73d9-4371-8135-bfa07be19ecf\") " pod="openshift-marketplace/redhat-marketplace-fps9r" Oct 03 15:26:38 crc kubenswrapper[4636]: I1003 15:26:38.058655 4636 generic.go:334] "Generic (PLEG): container 
finished" podID="b3a48cf5-20b3-4558-b970-60a0ef660d8f" containerID="42979e07d817b92eebb7cd74d1e37a497149b78ff91d6d1d1864b86ea2b731aa" exitCode=0 Oct 03 15:26:38 crc kubenswrapper[4636]: I1003 15:26:38.058704 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lcgn5/crc-debug-2skz4" event={"ID":"b3a48cf5-20b3-4558-b970-60a0ef660d8f","Type":"ContainerDied","Data":"42979e07d817b92eebb7cd74d1e37a497149b78ff91d6d1d1864b86ea2b731aa"} Oct 03 15:26:38 crc kubenswrapper[4636]: I1003 15:26:38.081838 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ppj2\" (UniqueName: \"kubernetes.io/projected/1dcccafa-73d9-4371-8135-bfa07be19ecf-kube-api-access-2ppj2\") pod \"redhat-marketplace-fps9r\" (UID: \"1dcccafa-73d9-4371-8135-bfa07be19ecf\") " pod="openshift-marketplace/redhat-marketplace-fps9r" Oct 03 15:26:38 crc kubenswrapper[4636]: I1003 15:26:38.161934 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fps9r" Oct 03 15:26:38 crc kubenswrapper[4636]: I1003 15:26:38.795041 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fps9r"] Oct 03 15:26:39 crc kubenswrapper[4636]: I1003 15:26:39.070378 4636 generic.go:334] "Generic (PLEG): container finished" podID="1dcccafa-73d9-4371-8135-bfa07be19ecf" containerID="0af1e82b52192b17e881440c726bbe24c2c623c6fdbead887dbd1893860d13f1" exitCode=0 Oct 03 15:26:39 crc kubenswrapper[4636]: I1003 15:26:39.070488 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fps9r" event={"ID":"1dcccafa-73d9-4371-8135-bfa07be19ecf","Type":"ContainerDied","Data":"0af1e82b52192b17e881440c726bbe24c2c623c6fdbead887dbd1893860d13f1"} Oct 03 15:26:39 crc kubenswrapper[4636]: I1003 15:26:39.071445 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fps9r" event={"ID":"1dcccafa-73d9-4371-8135-bfa07be19ecf","Type":"ContainerStarted","Data":"4721946e3bfc357932edf91f06f83d8f9d921b280ab76ba9c5f2462ee7da024a"} Oct 03 15:26:39 crc kubenswrapper[4636]: I1003 15:26:39.150047 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lcgn5/crc-debug-2skz4" Oct 03 15:26:39 crc kubenswrapper[4636]: I1003 15:26:39.281716 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3a48cf5-20b3-4558-b970-60a0ef660d8f-host\") pod \"b3a48cf5-20b3-4558-b970-60a0ef660d8f\" (UID: \"b3a48cf5-20b3-4558-b970-60a0ef660d8f\") " Oct 03 15:26:39 crc kubenswrapper[4636]: I1003 15:26:39.281940 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tj8wz\" (UniqueName: \"kubernetes.io/projected/b3a48cf5-20b3-4558-b970-60a0ef660d8f-kube-api-access-tj8wz\") pod \"b3a48cf5-20b3-4558-b970-60a0ef660d8f\" (UID: \"b3a48cf5-20b3-4558-b970-60a0ef660d8f\") " Oct 03 15:26:39 crc kubenswrapper[4636]: I1003 15:26:39.282900 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b3a48cf5-20b3-4558-b970-60a0ef660d8f-host" (OuterVolumeSpecName: "host") pod "b3a48cf5-20b3-4558-b970-60a0ef660d8f" (UID: "b3a48cf5-20b3-4558-b970-60a0ef660d8f"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 15:26:39 crc kubenswrapper[4636]: I1003 15:26:39.301566 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3a48cf5-20b3-4558-b970-60a0ef660d8f-kube-api-access-tj8wz" (OuterVolumeSpecName: "kube-api-access-tj8wz") pod "b3a48cf5-20b3-4558-b970-60a0ef660d8f" (UID: "b3a48cf5-20b3-4558-b970-60a0ef660d8f"). InnerVolumeSpecName "kube-api-access-tj8wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:26:39 crc kubenswrapper[4636]: I1003 15:26:39.383551 4636 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3a48cf5-20b3-4558-b970-60a0ef660d8f-host\") on node \"crc\" DevicePath \"\"" Oct 03 15:26:39 crc kubenswrapper[4636]: I1003 15:26:39.383590 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tj8wz\" (UniqueName: \"kubernetes.io/projected/b3a48cf5-20b3-4558-b970-60a0ef660d8f-kube-api-access-tj8wz\") on node \"crc\" DevicePath \"\"" Oct 03 15:26:40 crc kubenswrapper[4636]: I1003 15:26:40.080444 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lcgn5/crc-debug-2skz4" Oct 03 15:26:40 crc kubenswrapper[4636]: I1003 15:26:40.080453 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lcgn5/crc-debug-2skz4" event={"ID":"b3a48cf5-20b3-4558-b970-60a0ef660d8f","Type":"ContainerDied","Data":"c10c43401a1b62fc35223769e799cbebe1709932f8c91f0cd4d90cd011164e16"} Oct 03 15:26:40 crc kubenswrapper[4636]: I1003 15:26:40.081041 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c10c43401a1b62fc35223769e799cbebe1709932f8c91f0cd4d90cd011164e16" Oct 03 15:26:40 crc kubenswrapper[4636]: I1003 15:26:40.082724 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fps9r" event={"ID":"1dcccafa-73d9-4371-8135-bfa07be19ecf","Type":"ContainerStarted","Data":"e8ebc92084836ad936ea98015641452a4f223d4874a8dc120427cf3c2f6f5dc2"} Oct 03 15:26:41 crc kubenswrapper[4636]: I1003 15:26:41.093196 4636 generic.go:334] "Generic (PLEG): container finished" podID="1dcccafa-73d9-4371-8135-bfa07be19ecf" containerID="e8ebc92084836ad936ea98015641452a4f223d4874a8dc120427cf3c2f6f5dc2" exitCode=0 Oct 03 15:26:41 crc kubenswrapper[4636]: I1003 15:26:41.093231 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fps9r" event={"ID":"1dcccafa-73d9-4371-8135-bfa07be19ecf","Type":"ContainerDied","Data":"e8ebc92084836ad936ea98015641452a4f223d4874a8dc120427cf3c2f6f5dc2"} Oct 03 15:26:41 crc kubenswrapper[4636]: I1003 15:26:41.793745 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:26:41 crc kubenswrapper[4636]: E1003 15:26:41.795247 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:26:42 crc kubenswrapper[4636]: I1003 15:26:42.109005 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fps9r" 
event={"ID":"1dcccafa-73d9-4371-8135-bfa07be19ecf","Type":"ContainerStarted","Data":"59387a37f9ae896846ad4f968b37e8c4d00e23abe352fc53e5b70998d5e713b1"} Oct 03 15:26:42 crc kubenswrapper[4636]: I1003 15:26:42.135746 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fps9r" podStartSLOduration=2.521322868 podStartE2EDuration="5.135730248s" podCreationTimestamp="2025-10-03 15:26:37 +0000 UTC" firstStartedPulling="2025-10-03 15:26:39.074860447 +0000 UTC m=+5148.933586694" lastFinishedPulling="2025-10-03 15:26:41.689267837 +0000 UTC m=+5151.547994074" observedRunningTime="2025-10-03 15:26:42.129507234 +0000 UTC m=+5151.988233491" watchObservedRunningTime="2025-10-03 15:26:42.135730248 +0000 UTC m=+5151.994456485" Oct 03 15:26:45 crc kubenswrapper[4636]: I1003 15:26:45.359736 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lcgn5/crc-debug-2skz4"] Oct 03 15:26:45 crc kubenswrapper[4636]: I1003 15:26:45.368003 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lcgn5/crc-debug-2skz4"] Oct 03 15:26:46 crc kubenswrapper[4636]: I1003 15:26:46.550739 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lcgn5/crc-debug-6zpg9"] Oct 03 15:26:46 crc kubenswrapper[4636]: E1003 15:26:46.551116 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3a48cf5-20b3-4558-b970-60a0ef660d8f" containerName="container-00" Oct 03 15:26:46 crc kubenswrapper[4636]: I1003 15:26:46.551127 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3a48cf5-20b3-4558-b970-60a0ef660d8f" containerName="container-00" Oct 03 15:26:46 crc kubenswrapper[4636]: I1003 15:26:46.551363 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3a48cf5-20b3-4558-b970-60a0ef660d8f" containerName="container-00" Oct 03 15:26:46 crc kubenswrapper[4636]: I1003 15:26:46.551936 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lcgn5/crc-debug-6zpg9" Oct 03 15:26:46 crc kubenswrapper[4636]: I1003 15:26:46.617669 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwszd\" (UniqueName: \"kubernetes.io/projected/638f7716-4b4a-4e50-b1b2-e63200070c12-kube-api-access-gwszd\") pod \"crc-debug-6zpg9\" (UID: \"638f7716-4b4a-4e50-b1b2-e63200070c12\") " pod="openshift-must-gather-lcgn5/crc-debug-6zpg9" Oct 03 15:26:46 crc kubenswrapper[4636]: I1003 15:26:46.617985 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/638f7716-4b4a-4e50-b1b2-e63200070c12-host\") pod \"crc-debug-6zpg9\" (UID: \"638f7716-4b4a-4e50-b1b2-e63200070c12\") " pod="openshift-must-gather-lcgn5/crc-debug-6zpg9" Oct 03 15:26:46 crc kubenswrapper[4636]: I1003 15:26:46.719977 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/638f7716-4b4a-4e50-b1b2-e63200070c12-host\") pod \"crc-debug-6zpg9\" (UID: \"638f7716-4b4a-4e50-b1b2-e63200070c12\") " pod="openshift-must-gather-lcgn5/crc-debug-6zpg9" Oct 03 15:26:46 crc kubenswrapper[4636]: I1003 15:26:46.720138 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/638f7716-4b4a-4e50-b1b2-e63200070c12-host\") pod \"crc-debug-6zpg9\" (UID: \"638f7716-4b4a-4e50-b1b2-e63200070c12\") " pod="openshift-must-gather-lcgn5/crc-debug-6zpg9" Oct 03 15:26:46 crc kubenswrapper[4636]: I1003 15:26:46.720356 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwszd\" (UniqueName: \"kubernetes.io/projected/638f7716-4b4a-4e50-b1b2-e63200070c12-kube-api-access-gwszd\") pod \"crc-debug-6zpg9\" (UID: \"638f7716-4b4a-4e50-b1b2-e63200070c12\") " pod="openshift-must-gather-lcgn5/crc-debug-6zpg9" Oct 03 15:26:46 crc kubenswrapper[4636]: I1003 15:26:46.747906 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwszd\" (UniqueName: \"kubernetes.io/projected/638f7716-4b4a-4e50-b1b2-e63200070c12-kube-api-access-gwszd\") pod \"crc-debug-6zpg9\" (UID: \"638f7716-4b4a-4e50-b1b2-e63200070c12\") " pod="openshift-must-gather-lcgn5/crc-debug-6zpg9" Oct 03 15:26:46 crc kubenswrapper[4636]: I1003 15:26:46.806464 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3a48cf5-20b3-4558-b970-60a0ef660d8f" path="/var/lib/kubelet/pods/b3a48cf5-20b3-4558-b970-60a0ef660d8f/volumes" Oct 03 15:26:46 crc kubenswrapper[4636]: I1003 15:26:46.868377 4636 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lcgn5/crc-debug-6zpg9" Oct 03 15:26:47 crc kubenswrapper[4636]: I1003 15:26:47.146238 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lcgn5/crc-debug-6zpg9" event={"ID":"638f7716-4b4a-4e50-b1b2-e63200070c12","Type":"ContainerStarted","Data":"616a0fa28e1dac1204e759babeed6990f90711d3d7f733a128d44d59e54ad66f"} Oct 03 15:26:48 crc kubenswrapper[4636]: I1003 15:26:48.155668 4636 generic.go:334] "Generic (PLEG): container finished" podID="638f7716-4b4a-4e50-b1b2-e63200070c12" containerID="38d5c8614252d588320ac30807aad50839ceeb636960bd75027c9161220a737b" exitCode=0 Oct 03 15:26:48 crc kubenswrapper[4636]: I1003 15:26:48.155711 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lcgn5/crc-debug-6zpg9" event={"ID":"638f7716-4b4a-4e50-b1b2-e63200070c12","Type":"ContainerDied","Data":"38d5c8614252d588320ac30807aad50839ceeb636960bd75027c9161220a737b"} Oct 03 15:26:48 crc kubenswrapper[4636]: I1003 15:26:48.162986 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fps9r" Oct 03 15:26:48 crc kubenswrapper[4636]: I1003 15:26:48.163028 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fps9r" Oct 03 15:26:48 crc kubenswrapper[4636]: I1003 15:26:48.198637 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lcgn5/crc-debug-6zpg9"] Oct 03 15:26:48 crc kubenswrapper[4636]: I1003 15:26:48.209277 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lcgn5/crc-debug-6zpg9"] Oct 03 15:26:48 crc kubenswrapper[4636]: I1003 15:26:48.216519 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fps9r" Oct 03 15:26:49 crc kubenswrapper[4636]: I1003 15:26:49.275111 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fps9r" Oct 03 15:26:49 crc kubenswrapper[4636]: I1003 15:26:49.283848 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lcgn5/crc-debug-6zpg9" Oct 03 15:26:49 crc kubenswrapper[4636]: I1003 15:26:49.337030 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fps9r"] Oct 03 15:26:49 crc kubenswrapper[4636]: I1003 15:26:49.376090 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/638f7716-4b4a-4e50-b1b2-e63200070c12-host\") pod \"638f7716-4b4a-4e50-b1b2-e63200070c12\" (UID: \"638f7716-4b4a-4e50-b1b2-e63200070c12\") " Oct 03 15:26:49 crc kubenswrapper[4636]: I1003 15:26:49.376210 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwszd\" (UniqueName: \"kubernetes.io/projected/638f7716-4b4a-4e50-b1b2-e63200070c12-kube-api-access-gwszd\") pod \"638f7716-4b4a-4e50-b1b2-e63200070c12\" (UID: \"638f7716-4b4a-4e50-b1b2-e63200070c12\") " Oct 03 15:26:49 crc kubenswrapper[4636]: I1003 15:26:49.376463 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/638f7716-4b4a-4e50-b1b2-e63200070c12-host" (OuterVolumeSpecName: "host") pod "638f7716-4b4a-4e50-b1b2-e63200070c12" (UID: "638f7716-4b4a-4e50-b1b2-e63200070c12"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 15:26:49 crc kubenswrapper[4636]: I1003 15:26:49.377083 4636 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/638f7716-4b4a-4e50-b1b2-e63200070c12-host\") on node \"crc\" DevicePath \"\"" Oct 03 15:26:49 crc kubenswrapper[4636]: I1003 15:26:49.390753 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/638f7716-4b4a-4e50-b1b2-e63200070c12-kube-api-access-gwszd" (OuterVolumeSpecName: "kube-api-access-gwszd") pod "638f7716-4b4a-4e50-b1b2-e63200070c12" (UID: "638f7716-4b4a-4e50-b1b2-e63200070c12"). InnerVolumeSpecName "kube-api-access-gwszd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:26:49 crc kubenswrapper[4636]: I1003 15:26:49.478363 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwszd\" (UniqueName: \"kubernetes.io/projected/638f7716-4b4a-4e50-b1b2-e63200070c12-kube-api-access-gwszd\") on node \"crc\" DevicePath \"\"" Oct 03 15:26:50 crc kubenswrapper[4636]: I1003 15:26:50.180238 4636 scope.go:117] "RemoveContainer" containerID="38d5c8614252d588320ac30807aad50839ceeb636960bd75027c9161220a737b" Oct 03 15:26:50 crc kubenswrapper[4636]: I1003 15:26:50.180368 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lcgn5/crc-debug-6zpg9" Oct 03 15:26:50 crc kubenswrapper[4636]: I1003 15:26:50.427272 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb_b1a4227a-2f56-4bdd-b347-9d8df4ed42e8/util/0.log" Oct 03 15:26:50 crc kubenswrapper[4636]: I1003 15:26:50.702016 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb_b1a4227a-2f56-4bdd-b347-9d8df4ed42e8/util/0.log" Oct 03 15:26:50 crc kubenswrapper[4636]: I1003 15:26:50.710203 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb_b1a4227a-2f56-4bdd-b347-9d8df4ed42e8/pull/0.log" Oct 03 15:26:50 crc kubenswrapper[4636]: I1003 15:26:50.737338 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb_b1a4227a-2f56-4bdd-b347-9d8df4ed42e8/pull/0.log" Oct 03 15:26:50 crc kubenswrapper[4636]: I1003 15:26:50.807896 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="638f7716-4b4a-4e50-b1b2-e63200070c12" path="/var/lib/kubelet/pods/638f7716-4b4a-4e50-b1b2-e63200070c12/volumes" Oct 03 15:26:51 crc kubenswrapper[4636]: I1003 15:26:51.001135 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb_b1a4227a-2f56-4bdd-b347-9d8df4ed42e8/extract/0.log" Oct 03 15:26:51 crc kubenswrapper[4636]: I1003 15:26:51.009695 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb_b1a4227a-2f56-4bdd-b347-9d8df4ed42e8/pull/0.log" Oct 03 15:26:51 crc kubenswrapper[4636]: I1003 15:26:51.018820 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3595cfbe99a0467771e9c583484026311e1b8fa491f497756b76b2d3d8hthvb_b1a4227a-2f56-4bdd-b347-9d8df4ed42e8/util/0.log" Oct 03 15:26:51 crc kubenswrapper[4636]: I1003 
15:26:51.190584 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fps9r" podUID="1dcccafa-73d9-4371-8135-bfa07be19ecf" containerName="registry-server" containerID="cri-o://59387a37f9ae896846ad4f968b37e8c4d00e23abe352fc53e5b70998d5e713b1" gracePeriod=2 Oct 03 15:26:51 crc kubenswrapper[4636]: I1003 15:26:51.220322 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5f7c849b98-fh8lr_12b01d5f-b89d-4bf4-bd46-387f2a7ab48f/kube-rbac-proxy/0.log" Oct 03 15:26:51 crc kubenswrapper[4636]: I1003 15:26:51.272564 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5f7c849b98-fh8lr_12b01d5f-b89d-4bf4-bd46-387f2a7ab48f/manager/0.log" Oct 03 15:26:51 crc kubenswrapper[4636]: I1003 15:26:51.371692 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-z87w6_c8d803e5-9eca-49bf-976a-2acdfc25a727/kube-rbac-proxy/0.log" Oct 03 15:26:51 crc kubenswrapper[4636]: I1003 15:26:51.628660 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-z87w6_c8d803e5-9eca-49bf-976a-2acdfc25a727/manager/0.log" Oct 03 15:26:51 crc kubenswrapper[4636]: I1003 15:26:51.634226 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-x9xms_8002528c-8119-4119-923c-1e15162e63f3/kube-rbac-proxy/0.log" Oct 03 15:26:51 crc kubenswrapper[4636]: I1003 15:26:51.676888 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fps9r" Oct 03 15:26:51 crc kubenswrapper[4636]: I1003 15:26:51.696406 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-x9xms_8002528c-8119-4119-923c-1e15162e63f3/manager/0.log" Oct 03 15:26:51 crc kubenswrapper[4636]: I1003 15:26:51.821300 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcccafa-73d9-4371-8135-bfa07be19ecf-utilities\") pod \"1dcccafa-73d9-4371-8135-bfa07be19ecf\" (UID: \"1dcccafa-73d9-4371-8135-bfa07be19ecf\") " Oct 03 15:26:51 crc kubenswrapper[4636]: I1003 15:26:51.821421 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcccafa-73d9-4371-8135-bfa07be19ecf-catalog-content\") pod \"1dcccafa-73d9-4371-8135-bfa07be19ecf\" (UID: \"1dcccafa-73d9-4371-8135-bfa07be19ecf\") " Oct 03 15:26:51 crc kubenswrapper[4636]: I1003 15:26:51.821606 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ppj2\" (UniqueName: \"kubernetes.io/projected/1dcccafa-73d9-4371-8135-bfa07be19ecf-kube-api-access-2ppj2\") pod \"1dcccafa-73d9-4371-8135-bfa07be19ecf\" (UID: \"1dcccafa-73d9-4371-8135-bfa07be19ecf\") " Oct 03 15:26:51 crc kubenswrapper[4636]: I1003 15:26:51.822198 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dcccafa-73d9-4371-8135-bfa07be19ecf-utilities" (OuterVolumeSpecName: "utilities") pod "1dcccafa-73d9-4371-8135-bfa07be19ecf" (UID: "1dcccafa-73d9-4371-8135-bfa07be19ecf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:26:51 crc kubenswrapper[4636]: I1003 15:26:51.829390 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dcccafa-73d9-4371-8135-bfa07be19ecf-kube-api-access-2ppj2" (OuterVolumeSpecName: "kube-api-access-2ppj2") pod "1dcccafa-73d9-4371-8135-bfa07be19ecf" (UID: "1dcccafa-73d9-4371-8135-bfa07be19ecf"). InnerVolumeSpecName "kube-api-access-2ppj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:26:51 crc kubenswrapper[4636]: I1003 15:26:51.837720 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dcccafa-73d9-4371-8135-bfa07be19ecf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1dcccafa-73d9-4371-8135-bfa07be19ecf" (UID: "1dcccafa-73d9-4371-8135-bfa07be19ecf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:26:51 crc kubenswrapper[4636]: I1003 15:26:51.923224 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcccafa-73d9-4371-8135-bfa07be19ecf-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:26:51 crc kubenswrapper[4636]: I1003 15:26:51.923257 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcccafa-73d9-4371-8135-bfa07be19ecf-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:26:51 crc kubenswrapper[4636]: I1003 15:26:51.923271 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ppj2\" (UniqueName: \"kubernetes.io/projected/1dcccafa-73d9-4371-8135-bfa07be19ecf-kube-api-access-2ppj2\") on node \"crc\" DevicePath \"\"" Oct 03 15:26:51 crc kubenswrapper[4636]: I1003 15:26:51.937395 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5568b5d68-g8m75_3c207da6-bfc7-4287-aa67-56c0097f48f3/kube-rbac-proxy/0.log" Oct 03 15:26:51 crc kubenswrapper[4636]: I1003 15:26:51.989384 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5568b5d68-g8m75_3c207da6-bfc7-4287-aa67-56c0097f48f3/manager/0.log" Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.102403 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8f58bc9db-nshvn_eaba1b01-dfa6-48e4-b4f3-70a67fbfa8b5/kube-rbac-proxy/0.log" Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.152064 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8f58bc9db-nshvn_eaba1b01-dfa6-48e4-b4f3-70a67fbfa8b5/manager/0.log" Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.201608 4636 generic.go:334] "Generic (PLEG): container finished" podID="1dcccafa-73d9-4371-8135-bfa07be19ecf" containerID="59387a37f9ae896846ad4f968b37e8c4d00e23abe352fc53e5b70998d5e713b1" exitCode=0 Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.201648 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fps9r" event={"ID":"1dcccafa-73d9-4371-8135-bfa07be19ecf","Type":"ContainerDied","Data":"59387a37f9ae896846ad4f968b37e8c4d00e23abe352fc53e5b70998d5e713b1"} Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.201675 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fps9r" 
event={"ID":"1dcccafa-73d9-4371-8135-bfa07be19ecf","Type":"ContainerDied","Data":"4721946e3bfc357932edf91f06f83d8f9d921b280ab76ba9c5f2462ee7da024a"} Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.201692 4636 scope.go:117] "RemoveContainer" containerID="59387a37f9ae896846ad4f968b37e8c4d00e23abe352fc53e5b70998d5e713b1" Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.201688 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fps9r" Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.227744 4636 scope.go:117] "RemoveContainer" containerID="e8ebc92084836ad936ea98015641452a4f223d4874a8dc120427cf3c2f6f5dc2" Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.249861 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fps9r"] Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.257887 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fps9r"] Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.276172 4636 scope.go:117] "RemoveContainer" containerID="0af1e82b52192b17e881440c726bbe24c2c623c6fdbead887dbd1893860d13f1" Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.341224 4636 scope.go:117] "RemoveContainer" containerID="59387a37f9ae896846ad4f968b37e8c4d00e23abe352fc53e5b70998d5e713b1" Oct 03 15:26:52 crc kubenswrapper[4636]: E1003 15:26:52.342159 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59387a37f9ae896846ad4f968b37e8c4d00e23abe352fc53e5b70998d5e713b1\": container with ID starting with 59387a37f9ae896846ad4f968b37e8c4d00e23abe352fc53e5b70998d5e713b1 not found: ID does not exist" containerID="59387a37f9ae896846ad4f968b37e8c4d00e23abe352fc53e5b70998d5e713b1" Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.342194 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59387a37f9ae896846ad4f968b37e8c4d00e23abe352fc53e5b70998d5e713b1"} err="failed to get container status \"59387a37f9ae896846ad4f968b37e8c4d00e23abe352fc53e5b70998d5e713b1\": rpc error: code = NotFound desc = could not find container \"59387a37f9ae896846ad4f968b37e8c4d00e23abe352fc53e5b70998d5e713b1\": container with ID starting with 59387a37f9ae896846ad4f968b37e8c4d00e23abe352fc53e5b70998d5e713b1 not found: ID does not exist" Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.342217 4636 scope.go:117] "RemoveContainer" containerID="e8ebc92084836ad936ea98015641452a4f223d4874a8dc120427cf3c2f6f5dc2" Oct 03 15:26:52 crc kubenswrapper[4636]: E1003 15:26:52.342444 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8ebc92084836ad936ea98015641452a4f223d4874a8dc120427cf3c2f6f5dc2\": container with ID starting with e8ebc92084836ad936ea98015641452a4f223d4874a8dc120427cf3c2f6f5dc2 not found: ID does not exist" containerID="e8ebc92084836ad936ea98015641452a4f223d4874a8dc120427cf3c2f6f5dc2" Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.342463 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8ebc92084836ad936ea98015641452a4f223d4874a8dc120427cf3c2f6f5dc2"} err="failed to get container status \"e8ebc92084836ad936ea98015641452a4f223d4874a8dc120427cf3c2f6f5dc2\": rpc error: code = NotFound desc = could not find container 
\"e8ebc92084836ad936ea98015641452a4f223d4874a8dc120427cf3c2f6f5dc2\": container with ID starting with e8ebc92084836ad936ea98015641452a4f223d4874a8dc120427cf3c2f6f5dc2 not found: ID does not exist" Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.342476 4636 scope.go:117] "RemoveContainer" containerID="0af1e82b52192b17e881440c726bbe24c2c623c6fdbead887dbd1893860d13f1" Oct 03 15:26:52 crc kubenswrapper[4636]: E1003 15:26:52.342699 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0af1e82b52192b17e881440c726bbe24c2c623c6fdbead887dbd1893860d13f1\": container with ID starting with 0af1e82b52192b17e881440c726bbe24c2c623c6fdbead887dbd1893860d13f1 not found: ID does not exist" containerID="0af1e82b52192b17e881440c726bbe24c2c623c6fdbead887dbd1893860d13f1" Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.342733 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0af1e82b52192b17e881440c726bbe24c2c623c6fdbead887dbd1893860d13f1"} err="failed to get container status \"0af1e82b52192b17e881440c726bbe24c2c623c6fdbead887dbd1893860d13f1\": rpc error: code = NotFound desc = could not find container \"0af1e82b52192b17e881440c726bbe24c2c623c6fdbead887dbd1893860d13f1\": container with ID starting with 0af1e82b52192b17e881440c726bbe24c2c623c6fdbead887dbd1893860d13f1 not found: ID does not exist" Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.385951 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54876c876f-dwvpt_24c6a469-5b37-4dc9-baed-6a3c54b11861/kube-rbac-proxy/0.log" Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.413791 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54876c876f-dwvpt_24c6a469-5b37-4dc9-baed-6a3c54b11861/manager/0.log" Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.553810 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-6h5gc_6e000db3-2d29-4608-9a70-cfe88094a950/kube-rbac-proxy/0.log" Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.688816 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-699b87f775-p7d6m_d9a0c033-eaea-4336-96e6-9664f726e50e/kube-rbac-proxy/0.log" Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.751982 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-6h5gc_6e000db3-2d29-4608-9a70-cfe88094a950/manager/0.log" Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.803004 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dcccafa-73d9-4371-8135-bfa07be19ecf" path="/var/lib/kubelet/pods/1dcccafa-73d9-4371-8135-bfa07be19ecf/volumes" Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.914213 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-699b87f775-p7d6m_d9a0c033-eaea-4336-96e6-9664f726e50e/manager/0.log" Oct 03 15:26:52 crc kubenswrapper[4636]: I1003 15:26:52.962794 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-655d88ccb9-jcqmk_62436a9b-229c-486b-a715-6787e100d19b/kube-rbac-proxy/0.log" Oct 03 15:26:53 crc kubenswrapper[4636]: I1003 15:26:53.132746 4636 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-655d88ccb9-jcqmk_62436a9b-229c-486b-a715-6787e100d19b/manager/0.log" Oct 03 15:26:53 crc kubenswrapper[4636]: I1003 15:26:53.516737 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-8knqr_dff12a21-eff6-45da-bf37-d3f0620f9c05/kube-rbac-proxy/0.log" Oct 03 15:26:53 crc kubenswrapper[4636]: I1003 15:26:53.594354 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-8knqr_dff12a21-eff6-45da-bf37-d3f0620f9c05/manager/0.log" Oct 03 15:26:53 crc kubenswrapper[4636]: I1003 15:26:53.657685 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-2tjsj_8563d341-44cb-43b4-b7a8-ba3beeac60ea/kube-rbac-proxy/0.log" Oct 03 15:26:53 crc kubenswrapper[4636]: I1003 15:26:53.773077 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-2tjsj_8563d341-44cb-43b4-b7a8-ba3beeac60ea/manager/0.log" Oct 03 15:26:53 crc kubenswrapper[4636]: I1003 15:26:53.845147 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-9dc5p_89e06d08-9381-4aff-ba52-682080bd03bb/kube-rbac-proxy/0.log" Oct 03 15:26:53 crc kubenswrapper[4636]: I1003 15:26:53.930638 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-9dc5p_89e06d08-9381-4aff-ba52-682080bd03bb/manager/0.log" Oct 03 15:26:54 crc kubenswrapper[4636]: I1003 15:26:54.073682 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-cbxrx_60ec0b38-a07e-46e2-bc94-1af33d301eb6/kube-rbac-proxy/0.log" Oct 03 15:26:54 crc kubenswrapper[4636]: I1003 15:26:54.244883 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-cbxrx_60ec0b38-a07e-46e2-bc94-1af33d301eb6/manager/0.log" Oct 03 15:26:54 crc kubenswrapper[4636]: I1003 15:26:54.369944 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-48wqb_2f612d08-a478-46d7-aefd-f31051af25d9/kube-rbac-proxy/0.log" Oct 03 15:26:54 crc kubenswrapper[4636]: I1003 15:26:54.395138 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-48wqb_2f612d08-a478-46d7-aefd-f31051af25d9/manager/0.log" Oct 03 15:26:54 crc kubenswrapper[4636]: I1003 15:26:54.472814 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj_314cbc97-254d-4e64-a06f-68c7b0488c46/kube-rbac-proxy/0.log" Oct 03 15:26:54 crc kubenswrapper[4636]: I1003 15:26:54.600721 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665c9mvgj_314cbc97-254d-4e64-a06f-68c7b0488c46/manager/0.log" Oct 03 15:26:54 crc kubenswrapper[4636]: I1003 15:26:54.967589 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-66d65dc5dc-ljjsx_c266feaf-9983-414a-b65e-5a13fc55c419/kube-rbac-proxy/0.log" Oct 03 15:26:54 crc 
kubenswrapper[4636]: I1003 15:26:54.970662 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6dfbbfcbb4-flhg6_a119c810-cd24-4c51-a23b-88776132f825/kube-rbac-proxy/0.log" Oct 03 15:26:55 crc kubenswrapper[4636]: I1003 15:26:55.304784 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-bsvgc_8721857c-625f-4884-bb46-55f9ce071491/registry-server/0.log" Oct 03 15:26:55 crc kubenswrapper[4636]: I1003 15:26:55.330729 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-66d65dc5dc-ljjsx_c266feaf-9983-414a-b65e-5a13fc55c419/operator/0.log" Oct 03 15:26:55 crc kubenswrapper[4636]: I1003 15:26:55.572278 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-579449c7d5-8nj2c_f8f9f506-672a-4f93-8645-f0cd608feed0/kube-rbac-proxy/0.log" Oct 03 15:26:55 crc kubenswrapper[4636]: I1003 15:26:55.699670 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-lkd7z_ad1290bf-25e9-4766-8398-ff4811e65cad/kube-rbac-proxy/0.log" Oct 03 15:26:55 crc kubenswrapper[4636]: I1003 15:26:55.736731 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-579449c7d5-8nj2c_f8f9f506-672a-4f93-8645-f0cd608feed0/manager/0.log" Oct 03 15:26:55 crc kubenswrapper[4636]: I1003 15:26:55.854588 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-lkd7z_ad1290bf-25e9-4766-8398-ff4811e65cad/manager/0.log" Oct 03 15:26:55 crc kubenswrapper[4636]: I1003 15:26:55.957500 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-kmz9b_80c4c4f6-4616-48a9-98a7-f38ebdc58514/operator/0.log" Oct 03 15:26:55 crc kubenswrapper[4636]: I1003 15:26:55.999602 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6dfbbfcbb4-flhg6_a119c810-cd24-4c51-a23b-88776132f825/manager/0.log" Oct 03 15:26:56 crc kubenswrapper[4636]: I1003 15:26:56.167295 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-7qrjk_b3cb07c2-c2b9-4421-baba-ede1bed11656/kube-rbac-proxy/0.log" Oct 03 15:26:56 crc kubenswrapper[4636]: I1003 15:26:56.188267 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-7qrjk_b3cb07c2-c2b9-4421-baba-ede1bed11656/manager/0.log" Oct 03 15:26:56 crc kubenswrapper[4636]: I1003 15:26:56.266047 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-8qdtd_126025f8-40af-4a27-a9cc-8ece19d269b0/kube-rbac-proxy/0.log" Oct 03 15:26:56 crc kubenswrapper[4636]: I1003 15:26:56.305916 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-8qdtd_126025f8-40af-4a27-a9cc-8ece19d269b0/manager/0.log" Oct 03 15:26:56 crc kubenswrapper[4636]: I1003 15:26:56.432050 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-lc5hb_58d5890d-301f-43e9-b627-40f17f79da7f/kube-rbac-proxy/0.log" Oct 03 
15:26:56 crc kubenswrapper[4636]: I1003 15:26:56.494963 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-lc5hb_58d5890d-301f-43e9-b627-40f17f79da7f/manager/0.log" Oct 03 15:26:56 crc kubenswrapper[4636]: I1003 15:26:56.594511 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-xj7dp_24b72852-6d98-4011-9643-5079fa6f8076/kube-rbac-proxy/0.log" Oct 03 15:26:56 crc kubenswrapper[4636]: I1003 15:26:56.658619 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-xj7dp_24b72852-6d98-4011-9643-5079fa6f8076/manager/0.log" Oct 03 15:26:56 crc kubenswrapper[4636]: I1003 15:26:56.794631 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:26:56 crc kubenswrapper[4636]: E1003 15:26:56.794889 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:27:09 crc kubenswrapper[4636]: I1003 15:27:09.794802 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:27:09 crc kubenswrapper[4636]: E1003 15:27:09.795688 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:27:11 crc kubenswrapper[4636]: I1003 15:27:11.722015 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-r49hv_0499c819-4b67-4882-9354-f7b9d6d2adc7/control-plane-machine-set-operator/0.log" Oct 03 15:27:11 crc kubenswrapper[4636]: I1003 15:27:11.908397 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qzkgg_e697897f-0594-48da-967d-e429421b8fec/kube-rbac-proxy/0.log" Oct 03 15:27:11 crc kubenswrapper[4636]: I1003 15:27:11.923215 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qzkgg_e697897f-0594-48da-967d-e429421b8fec/machine-api-operator/0.log" Oct 03 15:27:24 crc kubenswrapper[4636]: I1003 15:27:24.322821 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-tswd6_be83bffc-d4e8-469a-85d9-6cc8ec6b64f4/cert-manager-controller/0.log" Oct 03 15:27:24 crc kubenswrapper[4636]: I1003 15:27:24.374252 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-lzr2w_2974bed1-bc60-45f9-a4ce-42f14db27998/cert-manager-cainjector/0.log" Oct 03 15:27:24 crc kubenswrapper[4636]: I1003 15:27:24.454827 4636 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-jw6vl_d933c0ac-7ab5-4b2f-9602-5b277d92679e/cert-manager-webhook/0.log" Oct 03 15:27:24 crc kubenswrapper[4636]: I1003 15:27:24.794327 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:27:24 crc kubenswrapper[4636]: E1003 15:27:24.794571 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:27:35 crc kubenswrapper[4636]: I1003 15:27:35.614526 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-zztw8_7bc7eb6e-0aa6-44a5-914e-7f3a97421f50/nmstate-console-plugin/0.log" Oct 03 15:27:35 crc kubenswrapper[4636]: I1003 15:27:35.794232 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:27:35 crc kubenswrapper[4636]: E1003 15:27:35.794832 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:27:35 crc kubenswrapper[4636]: I1003 15:27:35.827221 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-mtj6z_89380ab9-db32-4562-aec2-69a9f3c703b6/nmstate-handler/0.log" Oct 03 15:27:35 crc kubenswrapper[4636]: I1003 15:27:35.860603 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-vbc2m_438131cc-c24c-40a2-b874-8d1dca095f61/kube-rbac-proxy/0.log" Oct 03 15:27:35 crc kubenswrapper[4636]: I1003 15:27:35.973143 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-vbc2m_438131cc-c24c-40a2-b874-8d1dca095f61/nmstate-metrics/0.log" Oct 03 15:27:36 crc kubenswrapper[4636]: I1003 15:27:36.074946 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-7zh6n_a043488f-1ceb-4faa-a72a-76172cf550f7/nmstate-operator/0.log" Oct 03 15:27:36 crc kubenswrapper[4636]: I1003 15:27:36.199711 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-46ppb_dc21bb9c-24c3-4267-ab8d-96ed8e255c69/nmstate-webhook/0.log" Oct 03 15:27:49 crc kubenswrapper[4636]: I1003 15:27:49.283345 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-87w8j_f5c8bfd9-03d0-45ec-825a-d0c8f613c29c/kube-rbac-proxy/0.log" Oct 03 15:27:49 crc kubenswrapper[4636]: I1003 15:27:49.539895 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-frr-files/0.log" Oct 03 15:27:49 crc kubenswrapper[4636]: I1003 15:27:49.662119 4636 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-frr-files/0.log" Oct 03 15:27:49 crc kubenswrapper[4636]: I1003 15:27:49.695910 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-reloader/0.log" Oct 03 15:27:49 crc kubenswrapper[4636]: I1003 15:27:49.765558 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-metrics/0.log" Oct 03 15:27:49 crc kubenswrapper[4636]: I1003 15:27:49.888865 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-reloader/0.log" Oct 03 15:27:50 crc kubenswrapper[4636]: I1003 15:27:50.018849 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-87w8j_f5c8bfd9-03d0-45ec-825a-d0c8f613c29c/controller/0.log" Oct 03 15:27:50 crc kubenswrapper[4636]: I1003 15:27:50.061143 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-frr-files/0.log" Oct 03 15:27:50 crc kubenswrapper[4636]: I1003 15:27:50.064658 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-reloader/0.log" Oct 03 15:27:50 crc kubenswrapper[4636]: I1003 15:27:50.153014 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-metrics/0.log" Oct 03 15:27:50 crc kubenswrapper[4636]: I1003 15:27:50.216163 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-metrics/0.log" Oct 03 15:27:50 crc kubenswrapper[4636]: I1003 15:27:50.385850 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-frr-files/0.log" Oct 03 15:27:50 crc kubenswrapper[4636]: I1003 15:27:50.455713 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-reloader/0.log" Oct 03 15:27:50 crc kubenswrapper[4636]: I1003 15:27:50.458448 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/controller/0.log" Oct 03 15:27:50 crc kubenswrapper[4636]: I1003 15:27:50.463011 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/cp-metrics/0.log" Oct 03 15:27:50 crc kubenswrapper[4636]: I1003 15:27:50.657147 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/kube-rbac-proxy/0.log" Oct 03 15:27:50 crc kubenswrapper[4636]: I1003 15:27:50.673530 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/frr-metrics/0.log" Oct 03 15:27:50 crc kubenswrapper[4636]: I1003 15:27:50.773478 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/kube-rbac-proxy-frr/0.log" Oct 03 15:27:50 crc kubenswrapper[4636]: I1003 15:27:50.800507 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:27:50 crc kubenswrapper[4636]: E1003 15:27:50.800774 4636 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:27:50 crc kubenswrapper[4636]: I1003 15:27:50.918650 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/reloader/0.log" Oct 03 15:27:51 crc kubenswrapper[4636]: I1003 15:27:51.058924 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-ttj4x_360e3dae-23f1-4ddd-9815-d6a41e611501/frr-k8s-webhook-server/0.log" Oct 03 15:27:51 crc kubenswrapper[4636]: I1003 15:27:51.203429 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-79b89cf995-qtfsw_4d75cbbf-e22d-49aa-ae40-c77a69421e1a/manager/0.log" Oct 03 15:27:51 crc kubenswrapper[4636]: I1003 15:27:51.419538 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7d746fccb7-rtxlz_fdeca3bd-7bca-4463-b480-1b94361da961/webhook-server/0.log" Oct 03 15:27:51 crc kubenswrapper[4636]: I1003 15:27:51.688765 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ggz7j_39a6b95f-24cf-4365-93c0-b47b7a7672fb/kube-rbac-proxy/0.log" Oct 03 15:27:51 crc kubenswrapper[4636]: I1003 15:27:51.775122 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9p5n9"] Oct 03 15:27:51 crc kubenswrapper[4636]: E1003 15:27:51.775644 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dcccafa-73d9-4371-8135-bfa07be19ecf" containerName="extract-utilities" Oct 03 15:27:51 crc kubenswrapper[4636]: I1003 15:27:51.775662 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dcccafa-73d9-4371-8135-bfa07be19ecf" containerName="extract-utilities" Oct 03 15:27:51 crc kubenswrapper[4636]: E1003 15:27:51.775688 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="638f7716-4b4a-4e50-b1b2-e63200070c12" containerName="container-00" Oct 03 15:27:51 crc kubenswrapper[4636]: I1003 15:27:51.775695 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="638f7716-4b4a-4e50-b1b2-e63200070c12" containerName="container-00" Oct 03 15:27:51 crc kubenswrapper[4636]: E1003 15:27:51.775718 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dcccafa-73d9-4371-8135-bfa07be19ecf" containerName="extract-content" Oct 03 15:27:51 crc kubenswrapper[4636]: I1003 15:27:51.775725 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dcccafa-73d9-4371-8135-bfa07be19ecf" containerName="extract-content" Oct 03 15:27:51 crc kubenswrapper[4636]: E1003 15:27:51.775734 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dcccafa-73d9-4371-8135-bfa07be19ecf" containerName="registry-server" Oct 03 15:27:51 crc kubenswrapper[4636]: I1003 15:27:51.775761 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dcccafa-73d9-4371-8135-bfa07be19ecf" containerName="registry-server" Oct 03 15:27:51 crc kubenswrapper[4636]: I1003 15:27:51.775965 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="638f7716-4b4a-4e50-b1b2-e63200070c12" containerName="container-00" Oct 03 
15:27:51 crc kubenswrapper[4636]: I1003 15:27:51.775987 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dcccafa-73d9-4371-8135-bfa07be19ecf" containerName="registry-server" Oct 03 15:27:51 crc kubenswrapper[4636]: I1003 15:27:51.777552 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9p5n9" Oct 03 15:27:51 crc kubenswrapper[4636]: I1003 15:27:51.787261 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9p5n9"] Oct 03 15:27:51 crc kubenswrapper[4636]: I1003 15:27:51.882236 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56346325-a21a-44bd-8f6d-01f18a6ddb39-utilities\") pod \"redhat-operators-9p5n9\" (UID: \"56346325-a21a-44bd-8f6d-01f18a6ddb39\") " pod="openshift-marketplace/redhat-operators-9p5n9" Oct 03 15:27:51 crc kubenswrapper[4636]: I1003 15:27:51.882329 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld6qh\" (UniqueName: \"kubernetes.io/projected/56346325-a21a-44bd-8f6d-01f18a6ddb39-kube-api-access-ld6qh\") pod \"redhat-operators-9p5n9\" (UID: \"56346325-a21a-44bd-8f6d-01f18a6ddb39\") " pod="openshift-marketplace/redhat-operators-9p5n9" Oct 03 15:27:51 crc kubenswrapper[4636]: I1003 15:27:51.882375 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56346325-a21a-44bd-8f6d-01f18a6ddb39-catalog-content\") pod \"redhat-operators-9p5n9\" (UID: \"56346325-a21a-44bd-8f6d-01f18a6ddb39\") " pod="openshift-marketplace/redhat-operators-9p5n9" Oct 03 15:27:51 crc kubenswrapper[4636]: I1003 15:27:51.983284 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld6qh\" (UniqueName: \"kubernetes.io/projected/56346325-a21a-44bd-8f6d-01f18a6ddb39-kube-api-access-ld6qh\") pod \"redhat-operators-9p5n9\" (UID: \"56346325-a21a-44bd-8f6d-01f18a6ddb39\") " pod="openshift-marketplace/redhat-operators-9p5n9" Oct 03 15:27:51 crc kubenswrapper[4636]: I1003 15:27:51.983524 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56346325-a21a-44bd-8f6d-01f18a6ddb39-catalog-content\") pod \"redhat-operators-9p5n9\" (UID: \"56346325-a21a-44bd-8f6d-01f18a6ddb39\") " pod="openshift-marketplace/redhat-operators-9p5n9" Oct 03 15:27:51 crc kubenswrapper[4636]: I1003 15:27:51.983653 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56346325-a21a-44bd-8f6d-01f18a6ddb39-utilities\") pod \"redhat-operators-9p5n9\" (UID: \"56346325-a21a-44bd-8f6d-01f18a6ddb39\") " pod="openshift-marketplace/redhat-operators-9p5n9" Oct 03 15:27:51 crc kubenswrapper[4636]: I1003 15:27:51.984356 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56346325-a21a-44bd-8f6d-01f18a6ddb39-catalog-content\") pod \"redhat-operators-9p5n9\" (UID: \"56346325-a21a-44bd-8f6d-01f18a6ddb39\") " pod="openshift-marketplace/redhat-operators-9p5n9" Oct 03 15:27:51 crc kubenswrapper[4636]: I1003 15:27:51.984371 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/56346325-a21a-44bd-8f6d-01f18a6ddb39-utilities\") pod \"redhat-operators-9p5n9\" (UID: \"56346325-a21a-44bd-8f6d-01f18a6ddb39\") " pod="openshift-marketplace/redhat-operators-9p5n9" Oct 03 15:27:52 crc kubenswrapper[4636]: I1003 15:27:52.004530 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld6qh\" (UniqueName: \"kubernetes.io/projected/56346325-a21a-44bd-8f6d-01f18a6ddb39-kube-api-access-ld6qh\") pod \"redhat-operators-9p5n9\" (UID: \"56346325-a21a-44bd-8f6d-01f18a6ddb39\") " pod="openshift-marketplace/redhat-operators-9p5n9" Oct 03 15:27:52 crc kubenswrapper[4636]: I1003 15:27:52.113122 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9p5n9" Oct 03 15:27:52 crc kubenswrapper[4636]: I1003 15:27:52.278402 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ggz7j_39a6b95f-24cf-4365-93c0-b47b7a7672fb/speaker/0.log" Oct 03 15:27:52 crc kubenswrapper[4636]: I1003 15:27:52.315614 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tzvhr_27812b9a-f947-40fd-a74b-f10fa236e965/frr/0.log" Oct 03 15:27:52 crc kubenswrapper[4636]: I1003 15:27:52.699667 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9p5n9"] Oct 03 15:27:52 crc kubenswrapper[4636]: W1003 15:27:52.702787 4636 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56346325_a21a_44bd_8f6d_01f18a6ddb39.slice/crio-3d129a14af47fea7965232e98cd0f7ef7bc9fc84a67cc9e25895c7867f0bab40 WatchSource:0}: Error finding container 3d129a14af47fea7965232e98cd0f7ef7bc9fc84a67cc9e25895c7867f0bab40: Status 404 returned error can't find the container with id 3d129a14af47fea7965232e98cd0f7ef7bc9fc84a67cc9e25895c7867f0bab40 Oct 03 15:27:52 crc kubenswrapper[4636]: I1003 15:27:52.729482 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9p5n9" event={"ID":"56346325-a21a-44bd-8f6d-01f18a6ddb39","Type":"ContainerStarted","Data":"3d129a14af47fea7965232e98cd0f7ef7bc9fc84a67cc9e25895c7867f0bab40"} Oct 03 15:27:53 crc kubenswrapper[4636]: I1003 15:27:53.738770 4636 generic.go:334] "Generic (PLEG): container finished" podID="56346325-a21a-44bd-8f6d-01f18a6ddb39" containerID="fdcb9953610d1217242d3b31dfd5cc37fa906ee9839385f14e421d10f685d773" exitCode=0 Oct 03 15:27:53 crc kubenswrapper[4636]: I1003 15:27:53.739045 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9p5n9" event={"ID":"56346325-a21a-44bd-8f6d-01f18a6ddb39","Type":"ContainerDied","Data":"fdcb9953610d1217242d3b31dfd5cc37fa906ee9839385f14e421d10f685d773"} Oct 03 15:27:53 crc kubenswrapper[4636]: I1003 15:27:53.741508 4636 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 15:27:55 crc kubenswrapper[4636]: I1003 15:27:55.768465 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9p5n9" event={"ID":"56346325-a21a-44bd-8f6d-01f18a6ddb39","Type":"ContainerStarted","Data":"900c677c3477f2b815be8412e7383890b637e3eca8c74f38b8f47ba18ccfa941"} Oct 03 15:28:01 crc kubenswrapper[4636]: I1003 15:28:01.793970 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:28:01 crc kubenswrapper[4636]: E1003 15:28:01.794760 4636 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:28:01 crc kubenswrapper[4636]: I1003 15:28:01.823403 4636 generic.go:334] "Generic (PLEG): container finished" podID="56346325-a21a-44bd-8f6d-01f18a6ddb39" containerID="900c677c3477f2b815be8412e7383890b637e3eca8c74f38b8f47ba18ccfa941" exitCode=0 Oct 03 15:28:01 crc kubenswrapper[4636]: I1003 15:28:01.823446 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9p5n9" event={"ID":"56346325-a21a-44bd-8f6d-01f18a6ddb39","Type":"ContainerDied","Data":"900c677c3477f2b815be8412e7383890b637e3eca8c74f38b8f47ba18ccfa941"} Oct 03 15:28:02 crc kubenswrapper[4636]: I1003 15:28:02.834250 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9p5n9" event={"ID":"56346325-a21a-44bd-8f6d-01f18a6ddb39","Type":"ContainerStarted","Data":"973f71e3e313979ff83a98de4cdf74b94a0e55324e2045e315950eb8b69a5b05"} Oct 03 15:28:02 crc kubenswrapper[4636]: I1003 15:28:02.855060 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9p5n9" podStartSLOduration=3.097402139 podStartE2EDuration="11.855039156s" podCreationTimestamp="2025-10-03 15:27:51 +0000 UTC" firstStartedPulling="2025-10-03 15:27:53.741315518 +0000 UTC m=+5223.600041765" lastFinishedPulling="2025-10-03 15:28:02.498952545 +0000 UTC m=+5232.357678782" observedRunningTime="2025-10-03 15:28:02.849861309 +0000 UTC m=+5232.708587556" watchObservedRunningTime="2025-10-03 15:28:02.855039156 +0000 UTC m=+5232.713765403" Oct 03 15:28:04 crc kubenswrapper[4636]: I1003 15:28:04.270031 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj_a83000c5-7baa-4587-980d-90391869a32c/util/0.log" Oct 03 15:28:04 crc kubenswrapper[4636]: I1003 15:28:04.536053 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj_a83000c5-7baa-4587-980d-90391869a32c/pull/0.log" Oct 03 15:28:04 crc kubenswrapper[4636]: I1003 15:28:04.539834 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj_a83000c5-7baa-4587-980d-90391869a32c/pull/0.log" Oct 03 15:28:04 crc kubenswrapper[4636]: I1003 15:28:04.570501 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj_a83000c5-7baa-4587-980d-90391869a32c/util/0.log" Oct 03 15:28:04 crc kubenswrapper[4636]: I1003 15:28:04.715066 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj_a83000c5-7baa-4587-980d-90391869a32c/util/0.log" Oct 03 15:28:04 crc kubenswrapper[4636]: I1003 15:28:04.790952 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj_a83000c5-7baa-4587-980d-90391869a32c/extract/0.log" Oct 03 15:28:04 crc 
kubenswrapper[4636]: I1003 15:28:04.816703 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hpdwj_a83000c5-7baa-4587-980d-90391869a32c/pull/0.log" Oct 03 15:28:04 crc kubenswrapper[4636]: I1003 15:28:04.969785 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cx5c_955b6210-120c-4407-a1b0-2565f8407a8f/extract-utilities/0.log" Oct 03 15:28:05 crc kubenswrapper[4636]: I1003 15:28:05.191034 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cx5c_955b6210-120c-4407-a1b0-2565f8407a8f/extract-content/0.log" Oct 03 15:28:05 crc kubenswrapper[4636]: I1003 15:28:05.209266 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cx5c_955b6210-120c-4407-a1b0-2565f8407a8f/extract-utilities/0.log" Oct 03 15:28:05 crc kubenswrapper[4636]: I1003 15:28:05.211954 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cx5c_955b6210-120c-4407-a1b0-2565f8407a8f/extract-content/0.log" Oct 03 15:28:05 crc kubenswrapper[4636]: I1003 15:28:05.413083 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cx5c_955b6210-120c-4407-a1b0-2565f8407a8f/extract-content/0.log" Oct 03 15:28:05 crc kubenswrapper[4636]: I1003 15:28:05.413720 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cx5c_955b6210-120c-4407-a1b0-2565f8407a8f/extract-utilities/0.log" Oct 03 15:28:05 crc kubenswrapper[4636]: I1003 15:28:05.691294 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p7wgr_6f8d0287-e1cd-461f-917e-febaa7ac576e/extract-utilities/0.log" Oct 03 15:28:06 crc kubenswrapper[4636]: I1003 15:28:06.081232 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cx5c_955b6210-120c-4407-a1b0-2565f8407a8f/registry-server/0.log" Oct 03 15:28:06 crc kubenswrapper[4636]: I1003 15:28:06.120887 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p7wgr_6f8d0287-e1cd-461f-917e-febaa7ac576e/extract-utilities/0.log" Oct 03 15:28:06 crc kubenswrapper[4636]: I1003 15:28:06.181038 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p7wgr_6f8d0287-e1cd-461f-917e-febaa7ac576e/extract-content/0.log" Oct 03 15:28:06 crc kubenswrapper[4636]: I1003 15:28:06.197767 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p7wgr_6f8d0287-e1cd-461f-917e-febaa7ac576e/extract-content/0.log" Oct 03 15:28:06 crc kubenswrapper[4636]: I1003 15:28:06.513088 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p7wgr_6f8d0287-e1cd-461f-917e-febaa7ac576e/extract-utilities/0.log" Oct 03 15:28:06 crc kubenswrapper[4636]: I1003 15:28:06.548870 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p7wgr_6f8d0287-e1cd-461f-917e-febaa7ac576e/extract-content/0.log" Oct 03 15:28:06 crc kubenswrapper[4636]: I1003 15:28:06.829620 4636 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr_b557fea4-b20c-4f54-88af-89ed7d755cda/util/0.log" Oct 03 15:28:07 crc kubenswrapper[4636]: I1003 15:28:07.249672 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p7wgr_6f8d0287-e1cd-461f-917e-febaa7ac576e/registry-server/0.log" Oct 03 15:28:07 crc kubenswrapper[4636]: I1003 15:28:07.435281 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr_b557fea4-b20c-4f54-88af-89ed7d755cda/pull/0.log" Oct 03 15:28:07 crc kubenswrapper[4636]: I1003 15:28:07.465341 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr_b557fea4-b20c-4f54-88af-89ed7d755cda/util/0.log" Oct 03 15:28:07 crc kubenswrapper[4636]: I1003 15:28:07.499226 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr_b557fea4-b20c-4f54-88af-89ed7d755cda/pull/0.log" Oct 03 15:28:07 crc kubenswrapper[4636]: I1003 15:28:07.782144 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr_b557fea4-b20c-4f54-88af-89ed7d755cda/pull/0.log" Oct 03 15:28:07 crc kubenswrapper[4636]: I1003 15:28:07.816128 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr_b557fea4-b20c-4f54-88af-89ed7d755cda/extract/0.log" Oct 03 15:28:07 crc kubenswrapper[4636]: I1003 15:28:07.819206 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cfvpdr_b557fea4-b20c-4f54-88af-89ed7d755cda/util/0.log" Oct 03 15:28:08 crc kubenswrapper[4636]: I1003 15:28:08.051265 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-ncqfg_eb4639ab-5b3c-4f36-9c1e-077930e571e3/marketplace-operator/0.log" Oct 03 15:28:08 crc kubenswrapper[4636]: I1003 15:28:08.153158 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k8gjq_8df150bb-9cae-4839-bc31-0211d3610788/extract-utilities/0.log" Oct 03 15:28:08 crc kubenswrapper[4636]: I1003 15:28:08.352785 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k8gjq_8df150bb-9cae-4839-bc31-0211d3610788/extract-content/0.log" Oct 03 15:28:08 crc kubenswrapper[4636]: I1003 15:28:08.404028 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k8gjq_8df150bb-9cae-4839-bc31-0211d3610788/extract-content/0.log" Oct 03 15:28:08 crc kubenswrapper[4636]: I1003 15:28:08.404215 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k8gjq_8df150bb-9cae-4839-bc31-0211d3610788/extract-utilities/0.log" Oct 03 15:28:08 crc kubenswrapper[4636]: I1003 15:28:08.669609 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k8gjq_8df150bb-9cae-4839-bc31-0211d3610788/extract-utilities/0.log" Oct 03 15:28:08 crc kubenswrapper[4636]: I1003 15:28:08.706258 4636 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-9p5n9_56346325-a21a-44bd-8f6d-01f18a6ddb39/extract-utilities/0.log" Oct 03 15:28:08 crc kubenswrapper[4636]: I1003 15:28:08.719153 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k8gjq_8df150bb-9cae-4839-bc31-0211d3610788/extract-content/0.log" Oct 03 15:28:08 crc kubenswrapper[4636]: I1003 15:28:08.910059 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k8gjq_8df150bb-9cae-4839-bc31-0211d3610788/registry-server/0.log" Oct 03 15:28:08 crc kubenswrapper[4636]: I1003 15:28:08.999806 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9p5n9_56346325-a21a-44bd-8f6d-01f18a6ddb39/extract-content/0.log" Oct 03 15:28:09 crc kubenswrapper[4636]: I1003 15:28:09.039242 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9p5n9_56346325-a21a-44bd-8f6d-01f18a6ddb39/extract-content/0.log" Oct 03 15:28:09 crc kubenswrapper[4636]: I1003 15:28:09.048496 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9p5n9_56346325-a21a-44bd-8f6d-01f18a6ddb39/extract-utilities/0.log" Oct 03 15:28:09 crc kubenswrapper[4636]: I1003 15:28:09.201550 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9p5n9_56346325-a21a-44bd-8f6d-01f18a6ddb39/extract-utilities/0.log" Oct 03 15:28:09 crc kubenswrapper[4636]: I1003 15:28:09.250892 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9p5n9_56346325-a21a-44bd-8f6d-01f18a6ddb39/extract-content/0.log" Oct 03 15:28:09 crc kubenswrapper[4636]: I1003 15:28:09.300566 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9p5n9_56346325-a21a-44bd-8f6d-01f18a6ddb39/registry-server/0.log" Oct 03 15:28:09 crc kubenswrapper[4636]: I1003 15:28:09.311780 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s4hpr_7677fae0-2c20-47c0-aae2-52657add9d92/extract-utilities/0.log" Oct 03 15:28:09 crc kubenswrapper[4636]: I1003 15:28:09.505630 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s4hpr_7677fae0-2c20-47c0-aae2-52657add9d92/extract-content/0.log" Oct 03 15:28:09 crc kubenswrapper[4636]: I1003 15:28:09.507594 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s4hpr_7677fae0-2c20-47c0-aae2-52657add9d92/extract-content/0.log" Oct 03 15:28:09 crc kubenswrapper[4636]: I1003 15:28:09.537063 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s4hpr_7677fae0-2c20-47c0-aae2-52657add9d92/extract-utilities/0.log" Oct 03 15:28:09 crc kubenswrapper[4636]: I1003 15:28:09.754388 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s4hpr_7677fae0-2c20-47c0-aae2-52657add9d92/extract-utilities/0.log" Oct 03 15:28:09 crc kubenswrapper[4636]: I1003 15:28:09.781125 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s4hpr_7677fae0-2c20-47c0-aae2-52657add9d92/extract-content/0.log" Oct 03 15:28:10 crc kubenswrapper[4636]: I1003 15:28:10.236007 4636 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-s4hpr_7677fae0-2c20-47c0-aae2-52657add9d92/registry-server/0.log" Oct 03 15:28:12 crc kubenswrapper[4636]: I1003 15:28:12.114520 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9p5n9" Oct 03 15:28:12 crc kubenswrapper[4636]: I1003 15:28:12.114815 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9p5n9" Oct 03 15:28:12 crc kubenswrapper[4636]: I1003 15:28:12.793700 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:28:12 crc kubenswrapper[4636]: E1003 15:28:12.794002 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:28:13 crc kubenswrapper[4636]: I1003 15:28:13.159031 4636 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9p5n9" podUID="56346325-a21a-44bd-8f6d-01f18a6ddb39" containerName="registry-server" probeResult="failure" output=< Oct 03 15:28:13 crc kubenswrapper[4636]: timeout: failed to connect service ":50051" within 1s Oct 03 15:28:13 crc kubenswrapper[4636]: > Oct 03 15:28:22 crc kubenswrapper[4636]: I1003 15:28:22.163729 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9p5n9" Oct 03 15:28:22 crc kubenswrapper[4636]: I1003 15:28:22.221082 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9p5n9" Oct 03 15:28:22 crc kubenswrapper[4636]: I1003 15:28:22.975218 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9p5n9"] Oct 03 15:28:24 crc kubenswrapper[4636]: I1003 15:28:24.047087 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9p5n9" podUID="56346325-a21a-44bd-8f6d-01f18a6ddb39" containerName="registry-server" containerID="cri-o://973f71e3e313979ff83a98de4cdf74b94a0e55324e2045e315950eb8b69a5b05" gracePeriod=2 Oct 03 15:28:24 crc kubenswrapper[4636]: I1003 15:28:24.498308 4636 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9p5n9" Oct 03 15:28:24 crc kubenswrapper[4636]: I1003 15:28:24.588336 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56346325-a21a-44bd-8f6d-01f18a6ddb39-catalog-content\") pod \"56346325-a21a-44bd-8f6d-01f18a6ddb39\" (UID: \"56346325-a21a-44bd-8f6d-01f18a6ddb39\") " Oct 03 15:28:24 crc kubenswrapper[4636]: I1003 15:28:24.588640 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld6qh\" (UniqueName: \"kubernetes.io/projected/56346325-a21a-44bd-8f6d-01f18a6ddb39-kube-api-access-ld6qh\") pod \"56346325-a21a-44bd-8f6d-01f18a6ddb39\" (UID: \"56346325-a21a-44bd-8f6d-01f18a6ddb39\") " Oct 03 15:28:24 crc kubenswrapper[4636]: I1003 15:28:24.588736 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56346325-a21a-44bd-8f6d-01f18a6ddb39-utilities\") pod \"56346325-a21a-44bd-8f6d-01f18a6ddb39\" (UID: \"56346325-a21a-44bd-8f6d-01f18a6ddb39\") " Oct 03 15:28:24 crc kubenswrapper[4636]: I1003 15:28:24.589280 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56346325-a21a-44bd-8f6d-01f18a6ddb39-utilities" (OuterVolumeSpecName: "utilities") pod "56346325-a21a-44bd-8f6d-01f18a6ddb39" (UID: "56346325-a21a-44bd-8f6d-01f18a6ddb39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:28:24 crc kubenswrapper[4636]: I1003 15:28:24.616525 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56346325-a21a-44bd-8f6d-01f18a6ddb39-kube-api-access-ld6qh" (OuterVolumeSpecName: "kube-api-access-ld6qh") pod "56346325-a21a-44bd-8f6d-01f18a6ddb39" (UID: "56346325-a21a-44bd-8f6d-01f18a6ddb39"). InnerVolumeSpecName "kube-api-access-ld6qh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:28:24 crc kubenswrapper[4636]: I1003 15:28:24.691686 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld6qh\" (UniqueName: \"kubernetes.io/projected/56346325-a21a-44bd-8f6d-01f18a6ddb39-kube-api-access-ld6qh\") on node \"crc\" DevicePath \"\"" Oct 03 15:28:24 crc kubenswrapper[4636]: I1003 15:28:24.691722 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56346325-a21a-44bd-8f6d-01f18a6ddb39-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:28:24 crc kubenswrapper[4636]: I1003 15:28:24.697162 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56346325-a21a-44bd-8f6d-01f18a6ddb39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56346325-a21a-44bd-8f6d-01f18a6ddb39" (UID: "56346325-a21a-44bd-8f6d-01f18a6ddb39"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:28:24 crc kubenswrapper[4636]: I1003 15:28:24.793007 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56346325-a21a-44bd-8f6d-01f18a6ddb39-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:28:25 crc kubenswrapper[4636]: I1003 15:28:25.058186 4636 generic.go:334] "Generic (PLEG): container finished" podID="56346325-a21a-44bd-8f6d-01f18a6ddb39" containerID="973f71e3e313979ff83a98de4cdf74b94a0e55324e2045e315950eb8b69a5b05" exitCode=0 Oct 03 15:28:25 crc kubenswrapper[4636]: I1003 15:28:25.058230 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9p5n9" event={"ID":"56346325-a21a-44bd-8f6d-01f18a6ddb39","Type":"ContainerDied","Data":"973f71e3e313979ff83a98de4cdf74b94a0e55324e2045e315950eb8b69a5b05"} Oct 03 15:28:25 crc kubenswrapper[4636]: I1003 15:28:25.058248 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9p5n9" Oct 03 15:28:25 crc kubenswrapper[4636]: I1003 15:28:25.058267 4636 scope.go:117] "RemoveContainer" containerID="973f71e3e313979ff83a98de4cdf74b94a0e55324e2045e315950eb8b69a5b05" Oct 03 15:28:25 crc kubenswrapper[4636]: I1003 15:28:25.058256 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9p5n9" event={"ID":"56346325-a21a-44bd-8f6d-01f18a6ddb39","Type":"ContainerDied","Data":"3d129a14af47fea7965232e98cd0f7ef7bc9fc84a67cc9e25895c7867f0bab40"} Oct 03 15:28:25 crc kubenswrapper[4636]: I1003 15:28:25.077223 4636 scope.go:117] "RemoveContainer" containerID="900c677c3477f2b815be8412e7383890b637e3eca8c74f38b8f47ba18ccfa941" Oct 03 15:28:25 crc kubenswrapper[4636]: I1003 15:28:25.092400 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9p5n9"] Oct 03 15:28:25 crc kubenswrapper[4636]: I1003 15:28:25.096848 4636 scope.go:117] "RemoveContainer" containerID="fdcb9953610d1217242d3b31dfd5cc37fa906ee9839385f14e421d10f685d773" Oct 03 15:28:25 crc kubenswrapper[4636]: I1003 15:28:25.101178 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9p5n9"] Oct 03 15:28:25 crc kubenswrapper[4636]: I1003 15:28:25.140300 4636 scope.go:117] "RemoveContainer" containerID="973f71e3e313979ff83a98de4cdf74b94a0e55324e2045e315950eb8b69a5b05" Oct 03 15:28:25 crc kubenswrapper[4636]: E1003 15:28:25.140788 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"973f71e3e313979ff83a98de4cdf74b94a0e55324e2045e315950eb8b69a5b05\": container with ID starting with 973f71e3e313979ff83a98de4cdf74b94a0e55324e2045e315950eb8b69a5b05 not found: ID does not exist" containerID="973f71e3e313979ff83a98de4cdf74b94a0e55324e2045e315950eb8b69a5b05" Oct 03 15:28:25 crc kubenswrapper[4636]: I1003 15:28:25.140821 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"973f71e3e313979ff83a98de4cdf74b94a0e55324e2045e315950eb8b69a5b05"} err="failed to get container status \"973f71e3e313979ff83a98de4cdf74b94a0e55324e2045e315950eb8b69a5b05\": rpc error: code = NotFound desc = could not find container \"973f71e3e313979ff83a98de4cdf74b94a0e55324e2045e315950eb8b69a5b05\": container with ID starting with 973f71e3e313979ff83a98de4cdf74b94a0e55324e2045e315950eb8b69a5b05 not found: ID does not exist" Oct 03 15:28:25 crc 
kubenswrapper[4636]: I1003 15:28:25.140844 4636 scope.go:117] "RemoveContainer" containerID="900c677c3477f2b815be8412e7383890b637e3eca8c74f38b8f47ba18ccfa941" Oct 03 15:28:25 crc kubenswrapper[4636]: E1003 15:28:25.141054 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"900c677c3477f2b815be8412e7383890b637e3eca8c74f38b8f47ba18ccfa941\": container with ID starting with 900c677c3477f2b815be8412e7383890b637e3eca8c74f38b8f47ba18ccfa941 not found: ID does not exist" containerID="900c677c3477f2b815be8412e7383890b637e3eca8c74f38b8f47ba18ccfa941" Oct 03 15:28:25 crc kubenswrapper[4636]: I1003 15:28:25.141073 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"900c677c3477f2b815be8412e7383890b637e3eca8c74f38b8f47ba18ccfa941"} err="failed to get container status \"900c677c3477f2b815be8412e7383890b637e3eca8c74f38b8f47ba18ccfa941\": rpc error: code = NotFound desc = could not find container \"900c677c3477f2b815be8412e7383890b637e3eca8c74f38b8f47ba18ccfa941\": container with ID starting with 900c677c3477f2b815be8412e7383890b637e3eca8c74f38b8f47ba18ccfa941 not found: ID does not exist" Oct 03 15:28:25 crc kubenswrapper[4636]: I1003 15:28:25.141086 4636 scope.go:117] "RemoveContainer" containerID="fdcb9953610d1217242d3b31dfd5cc37fa906ee9839385f14e421d10f685d773" Oct 03 15:28:25 crc kubenswrapper[4636]: E1003 15:28:25.141312 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdcb9953610d1217242d3b31dfd5cc37fa906ee9839385f14e421d10f685d773\": container with ID starting with fdcb9953610d1217242d3b31dfd5cc37fa906ee9839385f14e421d10f685d773 not found: ID does not exist" containerID="fdcb9953610d1217242d3b31dfd5cc37fa906ee9839385f14e421d10f685d773" Oct 03 15:28:25 crc kubenswrapper[4636]: I1003 15:28:25.141334 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdcb9953610d1217242d3b31dfd5cc37fa906ee9839385f14e421d10f685d773"} err="failed to get container status \"fdcb9953610d1217242d3b31dfd5cc37fa906ee9839385f14e421d10f685d773\": rpc error: code = NotFound desc = could not find container \"fdcb9953610d1217242d3b31dfd5cc37fa906ee9839385f14e421d10f685d773\": container with ID starting with fdcb9953610d1217242d3b31dfd5cc37fa906ee9839385f14e421d10f685d773 not found: ID does not exist" Oct 03 15:28:25 crc kubenswrapper[4636]: I1003 15:28:25.793947 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:28:25 crc kubenswrapper[4636]: E1003 15:28:25.794479 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:28:26 crc kubenswrapper[4636]: I1003 15:28:26.804973 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56346325-a21a-44bd-8f6d-01f18a6ddb39" path="/var/lib/kubelet/pods/56346325-a21a-44bd-8f6d-01f18a6ddb39/volumes" Oct 03 15:28:36 crc kubenswrapper[4636]: I1003 15:28:36.795260 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" 
Oct 03 15:28:36 crc kubenswrapper[4636]: I1003 15:28:36.795260 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2"
Oct 03 15:28:36 crc kubenswrapper[4636]: E1003 15:28:36.797367 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 15:28:49 crc kubenswrapper[4636]: I1003 15:28:49.793659 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2"
Oct 03 15:28:49 crc kubenswrapper[4636]: E1003 15:28:49.794470 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170"
Oct 03 15:28:54 crc kubenswrapper[4636]: I1003 15:28:54.268830 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vnnsm"]
Oct 03 15:28:54 crc kubenswrapper[4636]: E1003 15:28:54.271542 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56346325-a21a-44bd-8f6d-01f18a6ddb39" containerName="extract-content"
Oct 03 15:28:54 crc kubenswrapper[4636]: I1003 15:28:54.271647 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="56346325-a21a-44bd-8f6d-01f18a6ddb39" containerName="extract-content"
Oct 03 15:28:54 crc kubenswrapper[4636]: E1003 15:28:54.271711 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56346325-a21a-44bd-8f6d-01f18a6ddb39" containerName="registry-server"
Oct 03 15:28:54 crc kubenswrapper[4636]: I1003 15:28:54.271766 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="56346325-a21a-44bd-8f6d-01f18a6ddb39" containerName="registry-server"
Oct 03 15:28:54 crc kubenswrapper[4636]: E1003 15:28:54.271841 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56346325-a21a-44bd-8f6d-01f18a6ddb39" containerName="extract-utilities"
Oct 03 15:28:54 crc kubenswrapper[4636]: I1003 15:28:54.271901 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="56346325-a21a-44bd-8f6d-01f18a6ddb39" containerName="extract-utilities"
Oct 03 15:28:54 crc kubenswrapper[4636]: I1003 15:28:54.272179 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="56346325-a21a-44bd-8f6d-01f18a6ddb39" containerName="registry-server"
Oct 03 15:28:54 crc kubenswrapper[4636]: I1003 15:28:54.273575 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vnnsm"
Oct 03 15:28:54 crc kubenswrapper[4636]: I1003 15:28:54.287256 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vnnsm"]
Oct 03 15:28:54 crc kubenswrapper[4636]: I1003 15:28:54.351706 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km9vr\" (UniqueName: \"kubernetes.io/projected/fedd80ed-dc7d-42f9-80fc-674098752d30-kube-api-access-km9vr\") pod \"certified-operators-vnnsm\" (UID: \"fedd80ed-dc7d-42f9-80fc-674098752d30\") " pod="openshift-marketplace/certified-operators-vnnsm"
Oct 03 15:28:54 crc kubenswrapper[4636]: I1003 15:28:54.351754 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fedd80ed-dc7d-42f9-80fc-674098752d30-utilities\") pod \"certified-operators-vnnsm\" (UID: \"fedd80ed-dc7d-42f9-80fc-674098752d30\") " pod="openshift-marketplace/certified-operators-vnnsm"
Oct 03 15:28:54 crc kubenswrapper[4636]: I1003 15:28:54.351825 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fedd80ed-dc7d-42f9-80fc-674098752d30-catalog-content\") pod \"certified-operators-vnnsm\" (UID: \"fedd80ed-dc7d-42f9-80fc-674098752d30\") " pod="openshift-marketplace/certified-operators-vnnsm"
Oct 03 15:28:54 crc kubenswrapper[4636]: I1003 15:28:54.453293 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km9vr\" (UniqueName: \"kubernetes.io/projected/fedd80ed-dc7d-42f9-80fc-674098752d30-kube-api-access-km9vr\") pod \"certified-operators-vnnsm\" (UID: \"fedd80ed-dc7d-42f9-80fc-674098752d30\") " pod="openshift-marketplace/certified-operators-vnnsm"
Oct 03 15:28:54 crc kubenswrapper[4636]: I1003 15:28:54.453347 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fedd80ed-dc7d-42f9-80fc-674098752d30-utilities\") pod \"certified-operators-vnnsm\" (UID: \"fedd80ed-dc7d-42f9-80fc-674098752d30\") " pod="openshift-marketplace/certified-operators-vnnsm"
Oct 03 15:28:54 crc kubenswrapper[4636]: I1003 15:28:54.453864 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fedd80ed-dc7d-42f9-80fc-674098752d30-utilities\") pod \"certified-operators-vnnsm\" (UID: \"fedd80ed-dc7d-42f9-80fc-674098752d30\") " pod="openshift-marketplace/certified-operators-vnnsm"
Oct 03 15:28:54 crc kubenswrapper[4636]: I1003 15:28:54.453924 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fedd80ed-dc7d-42f9-80fc-674098752d30-catalog-content\") pod \"certified-operators-vnnsm\" (UID: \"fedd80ed-dc7d-42f9-80fc-674098752d30\") " pod="openshift-marketplace/certified-operators-vnnsm"
Oct 03 15:28:54 crc kubenswrapper[4636]: I1003 15:28:54.454239 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fedd80ed-dc7d-42f9-80fc-674098752d30-catalog-content\") pod \"certified-operators-vnnsm\" (UID: \"fedd80ed-dc7d-42f9-80fc-674098752d30\") " pod="openshift-marketplace/certified-operators-vnnsm"
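The reconciler entries above walk each of the new pod's three volumes through the same state machine: VerifyControllerAttachedVolume (reconciler_common.go:245), then MountVolume (reconciler_common.go:218), then MountVolume.SetUp succeeded (operation_generator.go:637). A minimal Go sketch of that desired-state versus actual-state pass, with illustrative types rather than kubelet's real API:

    package main

    import "fmt"

    type volume struct{ name, plugin string }

    func main() {
        // The three volumes of certified-operators-vnnsm, as logged above.
        desired := []volume{
            {"kube-api-access-km9vr", "kubernetes.io/projected"},
            {"utilities", "kubernetes.io/empty-dir"},
            {"catalog-content", "kubernetes.io/empty-dir"},
        }
        mounted := map[string]bool{} // stand-in for the actual state of world
        for _, v := range desired {  // attach-check pass
            fmt.Printf("VerifyControllerAttachedVolume started for volume %q (%s)\n", v.name, v.plugin)
        }
        for _, v := range desired { // mount pass: only what is not yet mounted
            if !mounted[v.name] {
                fmt.Printf("MountVolume started for volume %q\n", v.name)
                mounted[v.name] = true
                fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", v.name)
            }
        }
    }

The empty-dir volumes succeed almost instantly, while the projected token volume (kube-api-access-km9vr) completes last, matching the ~27ms gap before its SetUp entry below.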
"MountVolume.SetUp succeeded for volume \"kube-api-access-km9vr\" (UniqueName: \"kubernetes.io/projected/fedd80ed-dc7d-42f9-80fc-674098752d30-kube-api-access-km9vr\") pod \"certified-operators-vnnsm\" (UID: \"fedd80ed-dc7d-42f9-80fc-674098752d30\") " pod="openshift-marketplace/certified-operators-vnnsm" Oct 03 15:28:54 crc kubenswrapper[4636]: I1003 15:28:54.593960 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vnnsm" Oct 03 15:28:55 crc kubenswrapper[4636]: I1003 15:28:55.206672 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vnnsm"] Oct 03 15:28:55 crc kubenswrapper[4636]: I1003 15:28:55.330050 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnnsm" event={"ID":"fedd80ed-dc7d-42f9-80fc-674098752d30","Type":"ContainerStarted","Data":"7a25d000f915cab72d95852f06d5ae48051d36d2204f7b6cc9613b8a67b35844"} Oct 03 15:28:56 crc kubenswrapper[4636]: I1003 15:28:56.340981 4636 generic.go:334] "Generic (PLEG): container finished" podID="fedd80ed-dc7d-42f9-80fc-674098752d30" containerID="ac1e855caccc177f1271e958eee8893cd759048a88fec0ee445c5123dd929bd0" exitCode=0 Oct 03 15:28:56 crc kubenswrapper[4636]: I1003 15:28:56.341643 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnnsm" event={"ID":"fedd80ed-dc7d-42f9-80fc-674098752d30","Type":"ContainerDied","Data":"ac1e855caccc177f1271e958eee8893cd759048a88fec0ee445c5123dd929bd0"} Oct 03 15:28:57 crc kubenswrapper[4636]: I1003 15:28:57.351874 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnnsm" event={"ID":"fedd80ed-dc7d-42f9-80fc-674098752d30","Type":"ContainerStarted","Data":"bfd7751c9ac18e81e28cf3eb38ff2fdb1f916538a318c4d65088e32ff507de79"} Oct 03 15:28:59 crc kubenswrapper[4636]: I1003 15:28:59.370326 4636 generic.go:334] "Generic (PLEG): container finished" podID="fedd80ed-dc7d-42f9-80fc-674098752d30" containerID="bfd7751c9ac18e81e28cf3eb38ff2fdb1f916538a318c4d65088e32ff507de79" exitCode=0 Oct 03 15:28:59 crc kubenswrapper[4636]: I1003 15:28:59.371179 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnnsm" event={"ID":"fedd80ed-dc7d-42f9-80fc-674098752d30","Type":"ContainerDied","Data":"bfd7751c9ac18e81e28cf3eb38ff2fdb1f916538a318c4d65088e32ff507de79"} Oct 03 15:29:00 crc kubenswrapper[4636]: I1003 15:29:00.402012 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnnsm" event={"ID":"fedd80ed-dc7d-42f9-80fc-674098752d30","Type":"ContainerStarted","Data":"517168777bbadc84bfd971dd76646c3ffc56642c56fd981e362708492031714f"} Oct 03 15:29:00 crc kubenswrapper[4636]: I1003 15:29:00.421853 4636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vnnsm" podStartSLOduration=2.772359601 podStartE2EDuration="6.421830764s" podCreationTimestamp="2025-10-03 15:28:54 +0000 UTC" firstStartedPulling="2025-10-03 15:28:56.34406041 +0000 UTC m=+5286.202786667" lastFinishedPulling="2025-10-03 15:28:59.993531573 +0000 UTC m=+5289.852257830" observedRunningTime="2025-10-03 15:29:00.417853649 +0000 UTC m=+5290.276579896" watchObservedRunningTime="2025-10-03 15:29:00.421830764 +0000 UTC m=+5290.280557011" Oct 03 15:29:02 crc kubenswrapper[4636]: I1003 15:29:02.794668 4636 scope.go:117] "RemoveContainer" 
containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:29:02 crc kubenswrapper[4636]: E1003 15:29:02.795185 4636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ngmch_openshift-machine-config-operator(f078d6dd-d81e-4a06-aca1-508bf23a2170)\"" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" Oct 03 15:29:04 crc kubenswrapper[4636]: I1003 15:29:04.594812 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vnnsm" Oct 03 15:29:04 crc kubenswrapper[4636]: I1003 15:29:04.594866 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vnnsm" Oct 03 15:29:04 crc kubenswrapper[4636]: I1003 15:29:04.646461 4636 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vnnsm" Oct 03 15:29:05 crc kubenswrapper[4636]: I1003 15:29:05.489930 4636 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vnnsm" Oct 03 15:29:05 crc kubenswrapper[4636]: I1003 15:29:05.557298 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vnnsm"] Oct 03 15:29:07 crc kubenswrapper[4636]: I1003 15:29:07.458349 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vnnsm" podUID="fedd80ed-dc7d-42f9-80fc-674098752d30" containerName="registry-server" containerID="cri-o://517168777bbadc84bfd971dd76646c3ffc56642c56fd981e362708492031714f" gracePeriod=2 Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.462922 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vnnsm" Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.476758 4636 generic.go:334] "Generic (PLEG): container finished" podID="fedd80ed-dc7d-42f9-80fc-674098752d30" containerID="517168777bbadc84bfd971dd76646c3ffc56642c56fd981e362708492031714f" exitCode=0 Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.476826 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnnsm" event={"ID":"fedd80ed-dc7d-42f9-80fc-674098752d30","Type":"ContainerDied","Data":"517168777bbadc84bfd971dd76646c3ffc56642c56fd981e362708492031714f"} Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.476886 4636 util.go:48] "No ready sandbox for pod can be found. 
Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.476886 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vnnsm"
Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.476924 4636 scope.go:117] "RemoveContainer" containerID="517168777bbadc84bfd971dd76646c3ffc56642c56fd981e362708492031714f"
Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.476909 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnnsm" event={"ID":"fedd80ed-dc7d-42f9-80fc-674098752d30","Type":"ContainerDied","Data":"7a25d000f915cab72d95852f06d5ae48051d36d2204f7b6cc9613b8a67b35844"}
Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.511914 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fedd80ed-dc7d-42f9-80fc-674098752d30-catalog-content\") pod \"fedd80ed-dc7d-42f9-80fc-674098752d30\" (UID: \"fedd80ed-dc7d-42f9-80fc-674098752d30\") "
Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.512038 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fedd80ed-dc7d-42f9-80fc-674098752d30-utilities\") pod \"fedd80ed-dc7d-42f9-80fc-674098752d30\" (UID: \"fedd80ed-dc7d-42f9-80fc-674098752d30\") "
Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.512144 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km9vr\" (UniqueName: \"kubernetes.io/projected/fedd80ed-dc7d-42f9-80fc-674098752d30-kube-api-access-km9vr\") pod \"fedd80ed-dc7d-42f9-80fc-674098752d30\" (UID: \"fedd80ed-dc7d-42f9-80fc-674098752d30\") "
Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.514146 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fedd80ed-dc7d-42f9-80fc-674098752d30-utilities" (OuterVolumeSpecName: "utilities") pod "fedd80ed-dc7d-42f9-80fc-674098752d30" (UID: "fedd80ed-dc7d-42f9-80fc-674098752d30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.519262 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fedd80ed-dc7d-42f9-80fc-674098752d30-kube-api-access-km9vr" (OuterVolumeSpecName: "kube-api-access-km9vr") pod "fedd80ed-dc7d-42f9-80fc-674098752d30" (UID: "fedd80ed-dc7d-42f9-80fc-674098752d30"). InnerVolumeSpecName "kube-api-access-km9vr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.523064 4636 scope.go:117] "RemoveContainer" containerID="bfd7751c9ac18e81e28cf3eb38ff2fdb1f916538a318c4d65088e32ff507de79"
Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.560162 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fedd80ed-dc7d-42f9-80fc-674098752d30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fedd80ed-dc7d-42f9-80fc-674098752d30" (UID: "fedd80ed-dc7d-42f9-80fc-674098752d30"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.569851 4636 scope.go:117] "RemoveContainer" containerID="ac1e855caccc177f1271e958eee8893cd759048a88fec0ee445c5123dd929bd0"
Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.610292 4636 scope.go:117] "RemoveContainer" containerID="517168777bbadc84bfd971dd76646c3ffc56642c56fd981e362708492031714f"
Oct 03 15:29:08 crc kubenswrapper[4636]: E1003 15:29:08.610682 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"517168777bbadc84bfd971dd76646c3ffc56642c56fd981e362708492031714f\": container with ID starting with 517168777bbadc84bfd971dd76646c3ffc56642c56fd981e362708492031714f not found: ID does not exist" containerID="517168777bbadc84bfd971dd76646c3ffc56642c56fd981e362708492031714f"
Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.610720 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"517168777bbadc84bfd971dd76646c3ffc56642c56fd981e362708492031714f"} err="failed to get container status \"517168777bbadc84bfd971dd76646c3ffc56642c56fd981e362708492031714f\": rpc error: code = NotFound desc = could not find container \"517168777bbadc84bfd971dd76646c3ffc56642c56fd981e362708492031714f\": container with ID starting with 517168777bbadc84bfd971dd76646c3ffc56642c56fd981e362708492031714f not found: ID does not exist"
Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.610746 4636 scope.go:117] "RemoveContainer" containerID="bfd7751c9ac18e81e28cf3eb38ff2fdb1f916538a318c4d65088e32ff507de79"
Oct 03 15:29:08 crc kubenswrapper[4636]: E1003 15:29:08.611448 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfd7751c9ac18e81e28cf3eb38ff2fdb1f916538a318c4d65088e32ff507de79\": container with ID starting with bfd7751c9ac18e81e28cf3eb38ff2fdb1f916538a318c4d65088e32ff507de79 not found: ID does not exist" containerID="bfd7751c9ac18e81e28cf3eb38ff2fdb1f916538a318c4d65088e32ff507de79"
Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.611474 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd7751c9ac18e81e28cf3eb38ff2fdb1f916538a318c4d65088e32ff507de79"} err="failed to get container status \"bfd7751c9ac18e81e28cf3eb38ff2fdb1f916538a318c4d65088e32ff507de79\": rpc error: code = NotFound desc = could not find container \"bfd7751c9ac18e81e28cf3eb38ff2fdb1f916538a318c4d65088e32ff507de79\": container with ID starting with bfd7751c9ac18e81e28cf3eb38ff2fdb1f916538a318c4d65088e32ff507de79 not found: ID does not exist"
containerID={"Type":"cri-o","ID":"ac1e855caccc177f1271e958eee8893cd759048a88fec0ee445c5123dd929bd0"} err="failed to get container status \"ac1e855caccc177f1271e958eee8893cd759048a88fec0ee445c5123dd929bd0\": rpc error: code = NotFound desc = could not find container \"ac1e855caccc177f1271e958eee8893cd759048a88fec0ee445c5123dd929bd0\": container with ID starting with ac1e855caccc177f1271e958eee8893cd759048a88fec0ee445c5123dd929bd0 not found: ID does not exist" Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.615185 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km9vr\" (UniqueName: \"kubernetes.io/projected/fedd80ed-dc7d-42f9-80fc-674098752d30-kube-api-access-km9vr\") on node \"crc\" DevicePath \"\"" Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.615205 4636 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fedd80ed-dc7d-42f9-80fc-674098752d30-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.615214 4636 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fedd80ed-dc7d-42f9-80fc-674098752d30-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.812990 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vnnsm"] Oct 03 15:29:08 crc kubenswrapper[4636]: I1003 15:29:08.822535 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vnnsm"] Oct 03 15:29:10 crc kubenswrapper[4636]: I1003 15:29:10.805667 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fedd80ed-dc7d-42f9-80fc-674098752d30" path="/var/lib/kubelet/pods/fedd80ed-dc7d-42f9-80fc-674098752d30/volumes" Oct 03 15:29:16 crc kubenswrapper[4636]: I1003 15:29:16.794210 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:29:17 crc kubenswrapper[4636]: I1003 15:29:17.563041 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerStarted","Data":"2d47f02c275bba17659c8e8a0b3d685f1274024e11c94cf0e4539f4569eaf05c"} Oct 03 15:30:00 crc kubenswrapper[4636]: I1003 15:30:00.166874 4636 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325090-xf9ln"] Oct 03 15:30:00 crc kubenswrapper[4636]: E1003 15:30:00.169518 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fedd80ed-dc7d-42f9-80fc-674098752d30" containerName="extract-content" Oct 03 15:30:00 crc kubenswrapper[4636]: I1003 15:30:00.169543 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="fedd80ed-dc7d-42f9-80fc-674098752d30" containerName="extract-content" Oct 03 15:30:00 crc kubenswrapper[4636]: E1003 15:30:00.169737 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fedd80ed-dc7d-42f9-80fc-674098752d30" containerName="registry-server" Oct 03 15:30:00 crc kubenswrapper[4636]: I1003 15:30:00.169745 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="fedd80ed-dc7d-42f9-80fc-674098752d30" containerName="registry-server" Oct 03 15:30:00 crc kubenswrapper[4636]: E1003 15:30:00.169779 4636 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fedd80ed-dc7d-42f9-80fc-674098752d30" 
containerName="extract-utilities" Oct 03 15:30:00 crc kubenswrapper[4636]: I1003 15:30:00.169799 4636 state_mem.go:107] "Deleted CPUSet assignment" podUID="fedd80ed-dc7d-42f9-80fc-674098752d30" containerName="extract-utilities" Oct 03 15:30:00 crc kubenswrapper[4636]: I1003 15:30:00.170092 4636 memory_manager.go:354] "RemoveStaleState removing state" podUID="fedd80ed-dc7d-42f9-80fc-674098752d30" containerName="registry-server" Oct 03 15:30:00 crc kubenswrapper[4636]: I1003 15:30:00.171211 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-xf9ln" Oct 03 15:30:00 crc kubenswrapper[4636]: I1003 15:30:00.174788 4636 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 15:30:00 crc kubenswrapper[4636]: I1003 15:30:00.189264 4636 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 15:30:00 crc kubenswrapper[4636]: I1003 15:30:00.190912 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325090-xf9ln"] Oct 03 15:30:00 crc kubenswrapper[4636]: I1003 15:30:00.224643 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ade86b0-6931-4a4b-9adb-c2f7ee11e74d-config-volume\") pod \"collect-profiles-29325090-xf9ln\" (UID: \"2ade86b0-6931-4a4b-9adb-c2f7ee11e74d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-xf9ln" Oct 03 15:30:00 crc kubenswrapper[4636]: I1003 15:30:00.224806 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ade86b0-6931-4a4b-9adb-c2f7ee11e74d-secret-volume\") pod \"collect-profiles-29325090-xf9ln\" (UID: \"2ade86b0-6931-4a4b-9adb-c2f7ee11e74d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-xf9ln" Oct 03 15:30:00 crc kubenswrapper[4636]: I1003 15:30:00.224895 4636 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhkjh\" (UniqueName: \"kubernetes.io/projected/2ade86b0-6931-4a4b-9adb-c2f7ee11e74d-kube-api-access-qhkjh\") pod \"collect-profiles-29325090-xf9ln\" (UID: \"2ade86b0-6931-4a4b-9adb-c2f7ee11e74d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-xf9ln" Oct 03 15:30:00 crc kubenswrapper[4636]: I1003 15:30:00.327177 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ade86b0-6931-4a4b-9adb-c2f7ee11e74d-config-volume\") pod \"collect-profiles-29325090-xf9ln\" (UID: \"2ade86b0-6931-4a4b-9adb-c2f7ee11e74d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-xf9ln" Oct 03 15:30:00 crc kubenswrapper[4636]: I1003 15:30:00.327317 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ade86b0-6931-4a4b-9adb-c2f7ee11e74d-secret-volume\") pod \"collect-profiles-29325090-xf9ln\" (UID: \"2ade86b0-6931-4a4b-9adb-c2f7ee11e74d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-xf9ln" Oct 03 15:30:00 crc kubenswrapper[4636]: I1003 15:30:00.327546 4636 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qhkjh\" (UniqueName: \"kubernetes.io/projected/2ade86b0-6931-4a4b-9adb-c2f7ee11e74d-kube-api-access-qhkjh\") pod \"collect-profiles-29325090-xf9ln\" (UID: \"2ade86b0-6931-4a4b-9adb-c2f7ee11e74d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-xf9ln" Oct 03 15:30:00 crc kubenswrapper[4636]: I1003 15:30:00.329037 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ade86b0-6931-4a4b-9adb-c2f7ee11e74d-config-volume\") pod \"collect-profiles-29325090-xf9ln\" (UID: \"2ade86b0-6931-4a4b-9adb-c2f7ee11e74d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-xf9ln" Oct 03 15:30:00 crc kubenswrapper[4636]: I1003 15:30:00.333501 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ade86b0-6931-4a4b-9adb-c2f7ee11e74d-secret-volume\") pod \"collect-profiles-29325090-xf9ln\" (UID: \"2ade86b0-6931-4a4b-9adb-c2f7ee11e74d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-xf9ln" Oct 03 15:30:00 crc kubenswrapper[4636]: I1003 15:30:00.346826 4636 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhkjh\" (UniqueName: \"kubernetes.io/projected/2ade86b0-6931-4a4b-9adb-c2f7ee11e74d-kube-api-access-qhkjh\") pod \"collect-profiles-29325090-xf9ln\" (UID: \"2ade86b0-6931-4a4b-9adb-c2f7ee11e74d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-xf9ln" Oct 03 15:30:00 crc kubenswrapper[4636]: I1003 15:30:00.500141 4636 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-xf9ln" Oct 03 15:30:00 crc kubenswrapper[4636]: I1003 15:30:00.982274 4636 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325090-xf9ln"] Oct 03 15:30:01 crc kubenswrapper[4636]: I1003 15:30:01.975790 4636 generic.go:334] "Generic (PLEG): container finished" podID="2ade86b0-6931-4a4b-9adb-c2f7ee11e74d" containerID="ab388b450108c83e487870e021fa359c954f0d92cbe69fd98aafbcdbf5da9284" exitCode=0 Oct 03 15:30:01 crc kubenswrapper[4636]: I1003 15:30:01.975850 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-xf9ln" event={"ID":"2ade86b0-6931-4a4b-9adb-c2f7ee11e74d","Type":"ContainerDied","Data":"ab388b450108c83e487870e021fa359c954f0d92cbe69fd98aafbcdbf5da9284"} Oct 03 15:30:01 crc kubenswrapper[4636]: I1003 15:30:01.976377 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-xf9ln" event={"ID":"2ade86b0-6931-4a4b-9adb-c2f7ee11e74d","Type":"ContainerStarted","Data":"1f6961a7eae4db33cdc2140908a2bab1282d01301a770b35dea14e60a9a5eda9"} Oct 03 15:30:03 crc kubenswrapper[4636]: I1003 15:30:03.316707 4636 util.go:48] "No ready sandbox for pod can be found. 
Oct 03 15:30:03 crc kubenswrapper[4636]: I1003 15:30:03.316707 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-xf9ln"
Oct 03 15:30:03 crc kubenswrapper[4636]: I1003 15:30:03.396049 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ade86b0-6931-4a4b-9adb-c2f7ee11e74d-config-volume\") pod \"2ade86b0-6931-4a4b-9adb-c2f7ee11e74d\" (UID: \"2ade86b0-6931-4a4b-9adb-c2f7ee11e74d\") "
Oct 03 15:30:03 crc kubenswrapper[4636]: I1003 15:30:03.396125 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhkjh\" (UniqueName: \"kubernetes.io/projected/2ade86b0-6931-4a4b-9adb-c2f7ee11e74d-kube-api-access-qhkjh\") pod \"2ade86b0-6931-4a4b-9adb-c2f7ee11e74d\" (UID: \"2ade86b0-6931-4a4b-9adb-c2f7ee11e74d\") "
Oct 03 15:30:03 crc kubenswrapper[4636]: I1003 15:30:03.396287 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ade86b0-6931-4a4b-9adb-c2f7ee11e74d-secret-volume\") pod \"2ade86b0-6931-4a4b-9adb-c2f7ee11e74d\" (UID: \"2ade86b0-6931-4a4b-9adb-c2f7ee11e74d\") "
Oct 03 15:30:03 crc kubenswrapper[4636]: I1003 15:30:03.397042 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ade86b0-6931-4a4b-9adb-c2f7ee11e74d-config-volume" (OuterVolumeSpecName: "config-volume") pod "2ade86b0-6931-4a4b-9adb-c2f7ee11e74d" (UID: "2ade86b0-6931-4a4b-9adb-c2f7ee11e74d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 15:30:03 crc kubenswrapper[4636]: I1003 15:30:03.403321 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ade86b0-6931-4a4b-9adb-c2f7ee11e74d-kube-api-access-qhkjh" (OuterVolumeSpecName: "kube-api-access-qhkjh") pod "2ade86b0-6931-4a4b-9adb-c2f7ee11e74d" (UID: "2ade86b0-6931-4a4b-9adb-c2f7ee11e74d"). InnerVolumeSpecName "kube-api-access-qhkjh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 15:30:03 crc kubenswrapper[4636]: I1003 15:30:03.405029 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ade86b0-6931-4a4b-9adb-c2f7ee11e74d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2ade86b0-6931-4a4b-9adb-c2f7ee11e74d" (UID: "2ade86b0-6931-4a4b-9adb-c2f7ee11e74d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 15:30:03 crc kubenswrapper[4636]: I1003 15:30:03.498344 4636 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ade86b0-6931-4a4b-9adb-c2f7ee11e74d-config-volume\") on node \"crc\" DevicePath \"\""
Oct 03 15:30:03 crc kubenswrapper[4636]: I1003 15:30:03.498375 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhkjh\" (UniqueName: \"kubernetes.io/projected/2ade86b0-6931-4a4b-9adb-c2f7ee11e74d-kube-api-access-qhkjh\") on node \"crc\" DevicePath \"\""
Oct 03 15:30:03 crc kubenswrapper[4636]: I1003 15:30:03.498384 4636 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ade86b0-6931-4a4b-9adb-c2f7ee11e74d-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 03 15:30:03 crc kubenswrapper[4636]: I1003 15:30:03.995762 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-xf9ln" event={"ID":"2ade86b0-6931-4a4b-9adb-c2f7ee11e74d","Type":"ContainerDied","Data":"1f6961a7eae4db33cdc2140908a2bab1282d01301a770b35dea14e60a9a5eda9"}
Oct 03 15:30:03 crc kubenswrapper[4636]: I1003 15:30:03.995808 4636 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f6961a7eae4db33cdc2140908a2bab1282d01301a770b35dea14e60a9a5eda9"
Oct 03 15:30:03 crc kubenswrapper[4636]: I1003 15:30:03.995870 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325090-xf9ln"
Oct 03 15:30:04 crc kubenswrapper[4636]: I1003 15:30:04.421587 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325045-gqkt7"]
Oct 03 15:30:04 crc kubenswrapper[4636]: I1003 15:30:04.428716 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325045-gqkt7"]
Oct 03 15:30:04 crc kubenswrapper[4636]: I1003 15:30:04.807647 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c6ce88c-287d-4151-b6ab-5c36bb092862" path="/var/lib/kubelet/pods/1c6ce88c-287d-4151-b6ab-5c36bb092862/volumes"
Oct 03 15:30:11 crc kubenswrapper[4636]: I1003 15:30:11.617320 4636 scope.go:117] "RemoveContainer" containerID="99f6d2ac7a43a2363484d4f1e02e32a5b28030ec809b504acb8388411f23f5b0"
Oct 03 15:30:11 crc kubenswrapper[4636]: I1003 15:30:11.662503 4636 scope.go:117] "RemoveContainer" containerID="e290993850c0c5ca851f80f706db67ad321a11fe4949e7ec53deb57461433959"
Oct 03 15:30:48 crc kubenswrapper[4636]: I1003 15:30:48.392756 4636 generic.go:334] "Generic (PLEG): container finished" podID="472a5287-7186-4eb6-8bd6-ce986291af01" containerID="3a86c22f5b5b97e3656b9944cc65bba34ca8506cc3e6194c448d9dc6063cf2ae" exitCode=0
Oct 03 15:30:48 crc kubenswrapper[4636]: I1003 15:30:48.392856 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lcgn5/must-gather-77vjr" event={"ID":"472a5287-7186-4eb6-8bd6-ce986291af01","Type":"ContainerDied","Data":"3a86c22f5b5b97e3656b9944cc65bba34ca8506cc3e6194c448d9dc6063cf2ae"}
Oct 03 15:30:48 crc kubenswrapper[4636]: I1003 15:30:48.393862 4636 scope.go:117] "RemoveContainer" containerID="3a86c22f5b5b97e3656b9944cc65bba34ca8506cc3e6194c448d9dc6063cf2ae"
path="/var/log/pods/openshift-must-gather-lcgn5_must-gather-77vjr_472a5287-7186-4eb6-8bd6-ce986291af01/gather/0.log" Oct 03 15:31:01 crc kubenswrapper[4636]: I1003 15:31:01.983260 4636 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lcgn5/must-gather-77vjr"] Oct 03 15:31:01 crc kubenswrapper[4636]: I1003 15:31:01.984055 4636 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-lcgn5/must-gather-77vjr" podUID="472a5287-7186-4eb6-8bd6-ce986291af01" containerName="copy" containerID="cri-o://b480660671849de28de99e80e9a4f522a2c5418b45dfca5e70720db50b91af50" gracePeriod=2 Oct 03 15:31:01 crc kubenswrapper[4636]: I1003 15:31:01.993043 4636 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lcgn5/must-gather-77vjr"] Oct 03 15:31:02 crc kubenswrapper[4636]: I1003 15:31:02.482126 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lcgn5_must-gather-77vjr_472a5287-7186-4eb6-8bd6-ce986291af01/copy/0.log" Oct 03 15:31:02 crc kubenswrapper[4636]: I1003 15:31:02.482798 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lcgn5/must-gather-77vjr" Oct 03 15:31:02 crc kubenswrapper[4636]: I1003 15:31:02.511380 4636 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lcgn5_must-gather-77vjr_472a5287-7186-4eb6-8bd6-ce986291af01/copy/0.log" Oct 03 15:31:02 crc kubenswrapper[4636]: I1003 15:31:02.511966 4636 generic.go:334] "Generic (PLEG): container finished" podID="472a5287-7186-4eb6-8bd6-ce986291af01" containerID="b480660671849de28de99e80e9a4f522a2c5418b45dfca5e70720db50b91af50" exitCode=143 Oct 03 15:31:02 crc kubenswrapper[4636]: I1003 15:31:02.512032 4636 scope.go:117] "RemoveContainer" containerID="b480660671849de28de99e80e9a4f522a2c5418b45dfca5e70720db50b91af50" Oct 03 15:31:02 crc kubenswrapper[4636]: I1003 15:31:02.512238 4636 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lcgn5/must-gather-77vjr" Oct 03 15:31:02 crc kubenswrapper[4636]: I1003 15:31:02.539446 4636 scope.go:117] "RemoveContainer" containerID="3a86c22f5b5b97e3656b9944cc65bba34ca8506cc3e6194c448d9dc6063cf2ae" Oct 03 15:31:02 crc kubenswrapper[4636]: I1003 15:31:02.557455 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/472a5287-7186-4eb6-8bd6-ce986291af01-must-gather-output\") pod \"472a5287-7186-4eb6-8bd6-ce986291af01\" (UID: \"472a5287-7186-4eb6-8bd6-ce986291af01\") " Oct 03 15:31:02 crc kubenswrapper[4636]: I1003 15:31:02.558141 4636 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7xsb\" (UniqueName: \"kubernetes.io/projected/472a5287-7186-4eb6-8bd6-ce986291af01-kube-api-access-b7xsb\") pod \"472a5287-7186-4eb6-8bd6-ce986291af01\" (UID: \"472a5287-7186-4eb6-8bd6-ce986291af01\") " Oct 03 15:31:02 crc kubenswrapper[4636]: I1003 15:31:02.566347 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/472a5287-7186-4eb6-8bd6-ce986291af01-kube-api-access-b7xsb" (OuterVolumeSpecName: "kube-api-access-b7xsb") pod "472a5287-7186-4eb6-8bd6-ce986291af01" (UID: "472a5287-7186-4eb6-8bd6-ce986291af01"). InnerVolumeSpecName "kube-api-access-b7xsb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 15:31:02 crc kubenswrapper[4636]: I1003 15:31:02.659275 4636 scope.go:117] "RemoveContainer" containerID="b480660671849de28de99e80e9a4f522a2c5418b45dfca5e70720db50b91af50" Oct 03 15:31:02 crc kubenswrapper[4636]: E1003 15:31:02.660378 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b480660671849de28de99e80e9a4f522a2c5418b45dfca5e70720db50b91af50\": container with ID starting with b480660671849de28de99e80e9a4f522a2c5418b45dfca5e70720db50b91af50 not found: ID does not exist" containerID="b480660671849de28de99e80e9a4f522a2c5418b45dfca5e70720db50b91af50" Oct 03 15:31:02 crc kubenswrapper[4636]: I1003 15:31:02.660420 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b480660671849de28de99e80e9a4f522a2c5418b45dfca5e70720db50b91af50"} err="failed to get container status \"b480660671849de28de99e80e9a4f522a2c5418b45dfca5e70720db50b91af50\": rpc error: code = NotFound desc = could not find container \"b480660671849de28de99e80e9a4f522a2c5418b45dfca5e70720db50b91af50\": container with ID starting with b480660671849de28de99e80e9a4f522a2c5418b45dfca5e70720db50b91af50 not found: ID does not exist" Oct 03 15:31:02 crc kubenswrapper[4636]: I1003 15:31:02.660448 4636 scope.go:117] "RemoveContainer" containerID="3a86c22f5b5b97e3656b9944cc65bba34ca8506cc3e6194c448d9dc6063cf2ae" Oct 03 15:31:02 crc kubenswrapper[4636]: I1003 15:31:02.661859 4636 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7xsb\" (UniqueName: \"kubernetes.io/projected/472a5287-7186-4eb6-8bd6-ce986291af01-kube-api-access-b7xsb\") on node \"crc\" DevicePath \"\"" Oct 03 15:31:02 crc kubenswrapper[4636]: E1003 15:31:02.662181 4636 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a86c22f5b5b97e3656b9944cc65bba34ca8506cc3e6194c448d9dc6063cf2ae\": container with ID starting with 3a86c22f5b5b97e3656b9944cc65bba34ca8506cc3e6194c448d9dc6063cf2ae not found: ID does not exist" containerID="3a86c22f5b5b97e3656b9944cc65bba34ca8506cc3e6194c448d9dc6063cf2ae" Oct 03 15:31:02 crc kubenswrapper[4636]: I1003 15:31:02.662207 4636 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a86c22f5b5b97e3656b9944cc65bba34ca8506cc3e6194c448d9dc6063cf2ae"} err="failed to get container status \"3a86c22f5b5b97e3656b9944cc65bba34ca8506cc3e6194c448d9dc6063cf2ae\": rpc error: code = NotFound desc = could not find container \"3a86c22f5b5b97e3656b9944cc65bba34ca8506cc3e6194c448d9dc6063cf2ae\": container with ID starting with 3a86c22f5b5b97e3656b9944cc65bba34ca8506cc3e6194c448d9dc6063cf2ae not found: ID does not exist" Oct 03 15:31:02 crc kubenswrapper[4636]: I1003 15:31:02.788941 4636 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/472a5287-7186-4eb6-8bd6-ce986291af01-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "472a5287-7186-4eb6-8bd6-ce986291af01" (UID: "472a5287-7186-4eb6-8bd6-ce986291af01"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 15:31:02 crc kubenswrapper[4636]: I1003 15:31:02.803984 4636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="472a5287-7186-4eb6-8bd6-ce986291af01" path="/var/lib/kubelet/pods/472a5287-7186-4eb6-8bd6-ce986291af01/volumes" Oct 03 15:31:02 crc kubenswrapper[4636]: I1003 15:31:02.865586 4636 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/472a5287-7186-4eb6-8bd6-ce986291af01-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 03 15:31:39 crc kubenswrapper[4636]: I1003 15:31:39.163055 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:31:39 crc kubenswrapper[4636]: I1003 15:31:39.163710 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:32:09 crc kubenswrapper[4636]: I1003 15:32:09.162684 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:32:09 crc kubenswrapper[4636]: I1003 15:32:09.163263 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:32:39 crc kubenswrapper[4636]: I1003 15:32:39.163336 4636 patch_prober.go:28] interesting pod/machine-config-daemon-ngmch container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 15:32:39 crc kubenswrapper[4636]: I1003 15:32:39.163861 4636 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 15:32:39 crc kubenswrapper[4636]: I1003 15:32:39.163908 4636 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" Oct 03 15:32:39 crc kubenswrapper[4636]: I1003 15:32:39.164747 4636 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2d47f02c275bba17659c8e8a0b3d685f1274024e11c94cf0e4539f4569eaf05c"} pod="openshift-machine-config-operator/machine-config-daemon-ngmch" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 15:32:39 crc kubenswrapper[4636]: I1003 15:32:39.164811 4636 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" podUID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerName="machine-config-daemon" containerID="cri-o://2d47f02c275bba17659c8e8a0b3d685f1274024e11c94cf0e4539f4569eaf05c" gracePeriod=600 Oct 03 15:32:39 crc kubenswrapper[4636]: I1003 15:32:39.371703 4636 generic.go:334] "Generic (PLEG): container finished" podID="f078d6dd-d81e-4a06-aca1-508bf23a2170" containerID="2d47f02c275bba17659c8e8a0b3d685f1274024e11c94cf0e4539f4569eaf05c" exitCode=0 Oct 03 15:32:39 crc kubenswrapper[4636]: I1003 15:32:39.371747 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerDied","Data":"2d47f02c275bba17659c8e8a0b3d685f1274024e11c94cf0e4539f4569eaf05c"} Oct 03 15:32:39 crc kubenswrapper[4636]: I1003 15:32:39.371783 4636 scope.go:117] "RemoveContainer" containerID="5d44c95b79a0078ad10d491d031ed7f6e95ee2fb87adb1705ab75a673f10adf2" Oct 03 15:32:40 crc kubenswrapper[4636]: I1003 15:32:40.383362 4636 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ngmch" event={"ID":"f078d6dd-d81e-4a06-aca1-508bf23a2170","Type":"ContainerStarted","Data":"22ae5247f51eeff2cd462339987b37afdd05e2aac260780d62630c764bd73ce8"}
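The restart closes the loop the probe entries trace: three consecutive liveness failures against http://127.0.0.1:8798/health (connection refused at 15:31:39, 15:32:09, and 15:32:39, thirty seconds apart, consistent with the common periodSeconds=30 and failureThreshold=3 settings, though neither value is stated in the log), then "failed liveness probe, will be restarted", a kill with the pod's gracePeriod=600, and a new container ID. A minimal sketch of such an HTTP liveness prober, with the period and threshold as assumptions:

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    func main() {
        const (
            url       = "http://127.0.0.1:8798/health" // the endpoint probed above
            period    = 30 * time.Second               // inferred from the timestamps
            threshold = 3                              // inferred: three failures, then restart
        )
        failures := 0
        for {
            resp, err := http.Get(url)
            if err != nil || resp.StatusCode >= 400 {
                failures++
                fmt.Printf("Probe failed probeType=\"Liveness\" probeResult=\"failure\" (%d/%d)\n", failures, threshold)
            } else {
                resp.Body.Close()
                failures = 0 // any success resets the streak
            }
            if failures >= threshold {
                fmt.Println("failed liveness probe, will be restarted")
                return // the kubelet would now kill the container with its grace period
            }
            time.Sleep(period)
        }
    }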